Project import generated by Copybara.
PiperOrigin-RevId: 263982686

@ -20,7 +20,7 @@ Check out the [Examples page](https://mediapipe.readthedocs.io/en/latest/example
A web-based visualizer is hosted on [viz.mediapipe.dev](https://viz.mediapipe.dev/). Please also see instructions [here](mediapipe/docs/visualizer.md).

## Community forum

*  [Discuss](https://groups.google.com/forum/#!forum/mediapipe) - General community discussion around MediaPipe

## Publications

* [MediaPipe: A Framework for Building Perception Pipelines](https://arxiv.org/abs/1906.08172)

			@ -29,7 +29,7 @@ A web-based visualizer is hosted on [viz.mediapipe.dev](https://viz.mediapipe.de
[Open sourced at CVPR 2019](https://sites.google.com/corp/view/perception-cv4arvr/mediapipe) on June 17-20 in Long Beach, CA

## Alpha Disclaimer

MediaPipe is currently in alpha for v0.6. We are still making breaking API changes and expect to reach a stable API by v1.0.

## Contributing

We welcome contributions. Please follow these [guidelines](./CONTRIBUTING.md).

			@ -1 +0,0 @@
theme: jekyll-theme-minimal

			@ -22,7 +22,7 @@ Android example users go through in detail. It teaches the following:
### Hello World! on iOS

[Hello World! on iOS](./hello_world_ios.md) is the iOS version of the Sobel
edge detection example.

### Object Detection with GPU

			@ -44,8 +44,9 @@ graphs can be easily adapted to run on CPU v.s. GPU.
[Face Detection with GPU](./face_detection_mobile_gpu.md) illustrates how to use
MediaPipe with a TFLite model for face detection in a GPU-accelerated pipeline.
The selfie face detection TFLite model is based on
["BlazeFace: Sub-millisecond Neural Face Detection on Mobile GPUs"](https://sites.google.com/view/perception-cv4arvr/blazeface),
and model details are described in the
[model card](https://sites.google.com/corp/view/perception-cv4arvr/blazeface#h.p_21ojPZDx3cqq).

*   [Android](./face_detection_mobile_gpu.md#android)
*   [iOS](./face_detection_mobile_gpu.md#ios)

			@ -71,8 +72,9 @@ MediaPipe with a TFLite model for hand tracking in a GPU-accelerated pipeline.
[Hair Segmentation on GPU](./hair_segmentation_mobile_gpu.md) illustrates how to
use MediaPipe with a TFLite model for hair segmentation in a GPU-accelerated
pipeline. The selfie hair segmentation TFLite model is based on
["Real-time Hair segmentation and recoloring on Mobile GPUs"](https://sites.google.com/view/perception-cv4arvr/hair-segmentation),
and model details are described in the
[model card](https://sites.google.com/corp/view/perception-cv4arvr/hair-segmentation#h.p_NimuO7PgHxlY).

*   [Android](./hair_segmentation_mobile_gpu.md#android)

			@ -4,22 +4,22 @@ This doc focuses on the
[example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/face_detection/face_detection_mobile_gpu.pbtxt)
that performs face detection with TensorFlow Lite on GPU.

{width="300"}

## Android

Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
general instructions to develop an Android application that uses MediaPipe.

The graph below is used in the
[Face Detection GPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/facedetectiongpu).
To build the app, run:

```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/facedetectiongpu
```

To further install the app on an Android device, run:

```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/facedetectiongpu/facedetectiongpu.apk
```

			@ -28,13 +28,13 @@ adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/a
## iOS

Please see [Hello World! in MediaPipe on iOS](hello_world_ios.md) for general
instructions to develop an iOS application that uses MediaPipe.

The graph below is used in the
[Face Detection GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/facedetectiongpu).
To build the app, please see the general
[MediaPipe iOS app building and setup instructions](./mediapipe_ios_setup.md).
Specific to this example, run:

```bash
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/facedetectiongpu:FaceDetectionGpuApp
```

			@ -42,11 +42,13 @@ bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/facedetectiongpu:Fa
## Graph

{width="400"}

To visualize the graph as shown above, copy the text specification of the graph
below and paste it into [MediaPipe Visualizer](https://viz.mediapipe.dev/).

[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/face_detection/face_detection_mobile_gpu.pbtxt)

```bash
# MediaPipe graph that performs face detection with TensorFlow Lite on GPU.
# Used in the example in
```

			@ -1,25 +1,25 @@
# Hair Segmentation (GPU)

This doc focuses on the
[example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hair_segmentation/hair_segmentation_mobile_gpu.pbtxt)
that performs hair segmentation with TensorFlow Lite on GPU.

{width="300"}

## Android

Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
general instructions to develop an Android application that uses MediaPipe.

The graph below is used in the
[Hair Segmentation GPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/hairsegmentationgpu).
To build the app, run:

```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/hairsegmentationgpu
```

To further install the app on an Android device, run:

```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/hairsegmentationgpu/hairsegmentationgpu.apk
```

			@ -27,11 +27,13 @@ adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/a
## Graph

{width="600"}

To visualize the graph as shown above, copy the text specification of the graph
below and paste it into [MediaPipe Visualizer](https://viz.mediapipe.dev/).

[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hair_segmentation/hair_segmentation_mobile_gpu.pbtxt)

```bash
# MediaPipe graph that performs hair segmentation with TensorFlow Lite on GPU.
# Used in the example in
```

			@ -2,30 +2,36 @@
This doc focuses on the
[example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_gpu.pbtxt)
that performs hand detection with TensorFlow Lite on GPU. It is related to the
[hand tracking example](./hand_tracking_mobile_gpu.md).

For overall context on hand detection and hand tracking, please read this
[Google AI Blog post](https://mediapipe.page.link/handgoogleaiblog).

{width="300"}

In the visualization above, green boxes represent the results of palm detection,
and the red box represents the extended hand rectangle designed to cover the
entire hand. The palm detection ML model (see also
[model card](https://mediapipe.page.link/handmc)) supports detection of multiple
palms, and this example selects only the one with the highest detection
confidence score to generate the hand rectangle, to be further utilized in the
[hand tracking example](./hand_tracking_mobile_gpu.md).

## Android

Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
general instructions to develop an Android application that uses MediaPipe.

The graph below is used in the
[Hand Detection GPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu).
To build the app, run:

```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu
```

To further install the app on an Android device, run:

```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/handdetectiongpu.apk
```

			@ -34,13 +40,13 @@ adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/a
## iOS

Please see [Hello World! in MediaPipe on iOS](hello_world_ios.md) for general
instructions to develop an iOS application that uses MediaPipe.

The graph below is used in the
[Hand Detection GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/handdetectiongpu).
To build the app, please see the general
[MediaPipe iOS app building and setup instructions](./mediapipe_ios_setup.md).
Specific to this example, run:

```bash
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/handdetectiongpu:HandDetectionGpuApp
```

			@ -48,17 +54,18 @@ bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/handdetectiongpu:Ha
## Graph

The hand detection [main graph](#main-graph) internally utilizes a
[hand detection subgraph](#hand-detection-subgraph). The subgraph shows up in
the main graph visualization as the `HandDetection` node colored in purple, and
the subgraph itself can also be visualized just like a regular graph. For more
information on how to visualize a graph that includes subgraphs, see
[visualizing subgraphs](./visualizer.md#visualizing-subgraphs).

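Concretely, a subgraph registered under a `type` name can be instantiated in the main graph like any other node. The fragment below is an illustrative sketch, not the verbatim file contents; the stream names are assumptions:

```bash
# Illustrative sketch only: a subgraph instantiated by its registered type.
node {
  calculator: "HandDetectionSubgraph"
  input_stream: "input_video"
  output_stream: "DETECTIONS:palm_detections"
  output_stream: "NORM_RECT:hand_rect_from_palm_detections"
}
```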
### Main Graph

{width="500"}

[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_mobile.pbtxt)

```bash
# MediaPipe graph that performs hand detection with TensorFlow Lite on GPU.
```

			@ -125,9 +132,15 @@ node {
 | 
			
		|||
}
 | 
			
		||||
```
 | 
			
		||||
 | 
			
		||||
{width="500"}
 | 
			
		||||
### Hand Detection Subgraph
 | 
			
		||||
 | 
			
		||||

 | 
			
		||||
 | 
			
		||||
[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_gpu.pbtxt)
 | 
			
		||||
 | 
			
		||||
```bash
 | 
			
		||||
# MediaPipe hand detection subgraph.
 | 
			
		||||
 | 
			
		||||
type: "HandDetectionSubgraph"
 | 
			
		||||
 | 
			
		||||
input_stream: "input_video"
 | 
			
		||||
| 
						 | 
				
			
			
 | 
			
		|||
| 
						 | 
				
			
			@ -1,32 +1,41 @@
# Hand Tracking (GPU)

This doc focuses on the
[example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_tracking_mobile.pbtxt)
that performs hand tracking with TensorFlow Lite on GPU. It is related to the
[hand detection example](./hand_detection_mobile_gpu.md), and we recommend users
to review the hand detection example first.

For overall context on hand detection and hand tracking, please read this
[Google AI Blog post](https://mediapipe.page.link/handgoogleaiblog).

{width="300"}

In the visualization above, the red dots represent the localized hand landmarks,
and the green lines are simply connections between selected landmark pairs for
visualization of the hand skeleton. The red box represents a hand rectangle that
covers the entire hand, derived either from hand detection (see
[hand detection example](./hand_detection_mobile_gpu.md)) or from the previous
round of hand landmark localization using an ML model (see also
[model card](https://mediapipe.page.link/handmc)). Hand landmark localization is
performed only within the hand rectangle for computational efficiency and
accuracy, and hand detection is only invoked when landmark localization could
not identify hand presence in the previous iteration.

## Android

Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
general instructions to develop an Android application that uses MediaPipe.

The graph below is used in the
[Hand Tracking GPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu).
To build the app, run:

```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu
```

To further install the app on an Android device, run:

```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/handtrackinggpu.apk
```

			@ -35,13 +44,13 @@ adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/a
## iOS

Please see [Hello World! in MediaPipe on iOS](hello_world_ios.md) for general
instructions to develop an iOS application that uses MediaPipe.

The graph below is used in the
[Hand Tracking GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/handtrackinggpu).
To build the app, please see the general
[MediaPipe iOS app building and setup instructions](./mediapipe_ios_setup.md).
Specific to this example, run:

```bash
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/handtrackinggpu:HandTrackingGpuApp
```

			@ -49,20 +58,21 @@ bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/handtrackinggpu:Han
## Graph

The hand tracking [main graph](#main-graph) internally utilizes a
[hand detection subgraph](#hand-detection-subgraph), a
[hand landmark subgraph](#hand-landmark-subgraph) and a
[renderer subgraph](#renderer-subgraph).

The subgraphs show up in the main graph visualization as nodes colored in
purple, and the subgraph itself can also be visualized just like a regular
graph. For more information on how to visualize a graph that includes subgraphs,
see [visualizing subgraphs](./visualizer.md#visualizing-subgraphs).

### Main Graph

{width="400"}

[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_tracking_mobile.pbtxt)

```bash
# MediaPipe graph that performs hand tracking with TensorFlow Lite on GPU.
```

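As an illustrative sketch (not the verbatim file contents; all stream names here are assumptions), the main graph chains the subgraphs together by their registered types:

```bash
# Illustrative sketch only: chaining subgraphs in the main graph.
node {
  calculator: "HandDetectionSubgraph"
  input_stream: "input_video"
  output_stream: "NORM_RECT:hand_rect_from_palm_detections"
}
node {
  calculator: "HandLandmarkSubgraph"
  input_stream: "IMAGE:input_video"
  input_stream: "NORM_RECT:hand_rect_from_palm_detections"
  output_stream: "LANDMARKS:hand_landmarks"
}
node {
  calculator: "RendererSubgraph"
  input_stream: "IMAGE:input_video"
  input_stream: "LANDMARKS:hand_landmarks"
  output_stream: "IMAGE:output_video"
}
```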
			@ -152,9 +162,15 @@ node {
### Hand Detection Subgraph

{width="500"}

[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_gpu.pbtxt)

```bash
# MediaPipe hand detection subgraph.

type: "HandDetectionSubgraph"

input_stream: "input_video"
```

			@ -352,7 +368,11 @@ node {
### Hand Landmark Subgraph

{width="400"}

[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_landmark_gpu.pbtxt)

```bash
# MediaPipe hand landmark localization subgraph.
```

			@ -532,7 +552,11 @@ node {
### Renderer Subgraph

{width="500"}

[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/renderer_gpu.pbtxt)

```bash
# MediaPipe hand tracking rendering subgraph.
```

			@ -14,7 +14,7 @@ graph on Android.
A simple camera app for real-time Sobel edge detection applied to a live video
stream on an Android device.

{width="300"}

## Setup

			@ -56,7 +56,7 @@ node: {
A visualization of the graph is shown below:

{width="200"}

This graph has a single input stream named `input_video` for all incoming frames
that will be provided by your device's camera.

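To make the input-stream wiring concrete, a minimal sketch of such a graph follows; the calculator and intermediate stream names are illustrative assumptions, not necessarily the exact contents of the edge detection graph:

```bash
# Illustrative sketch only: one graph input stream feeding a chain of nodes.
input_stream: "input_video"
output_stream: "output_video"
node {
  calculator: "LuminanceCalculator"   # assumed name, for illustration
  input_stream: "input_video"
  output_stream: "luma_video"
}
node {
  calculator: "SobelEdgesCalculator"  # assumed name, for illustration
  input_stream: "luma_video"
  output_stream: "output_video"
}
```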
			@ -252,7 +252,7 @@ adb install bazel-bin/$APPLICATION_PATH/edgedetectiongpu.apk
Open the application on your device. It should display a screen with the text
`Hello World!`.

{width="300"}

## Using the camera via `CameraX`

			@ -369,7 +369,7 @@ Add the following line in the `$APPLICATION_PATH/res/values/strings.xml` file:
When the user doesn't grant camera permission, the screen will now look like
this:

{width="300"}

Now, we will add the [`SurfaceTexture`] and [`SurfaceView`] objects to
`MainActivity`:

						 | 
				
			
			@ -709,7 +709,7 @@ And that's it! You should now be able to successfully build and run the
 | 
			
		|||
application on the device and see Sobel edge detection running on a live camera
 | 
			
		||||
feed! Congrats!
 | 
			
		||||
 | 
			
		||||
{width="300"}
 | 
			
		||||

 | 
			
		||||
 | 
			
		||||
If you ran into any issues, please see the full code of the tutorial
 | 
			
		||||
[here](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/edgedetectiongpu).
 | 
			
		||||
| 
						 | 
				
			
			
 | 
			
		|||
| 
						 | 
				
			
			@ -72,7 +72,7 @@
    This graph consists of 1 graph input stream (`in`) and 1 graph output stream
    (`out`), and 2 [`PassThroughCalculator`]s connected serially.

    {width="200"}

4.  Before running the graph, an `OutputStreamPoller` object is connected to the
    output stream in order to later retrieve the graph output, and a graph run
			@ -14,7 +14,7 @@ graph on iOS.
A simple camera app for real-time Sobel edge detection applied to a live video
stream on an iOS device.

{width="300"}

## Setup

			@ -54,7 +54,7 @@ node: {
A visualization of the graph is shown below:

{width="200"}

This graph has a single input stream named `input_video` for all incoming frames
that will be provided by your device's camera.

			@ -174,10 +174,11 @@ bazel build -c opt --config=ios_arm64 <$APPLICATION_PATH>:EdgeDetectionGpuApp'
For example, to build the `EdgeDetectionGpuApp` application in
`mediapipe/examples/ios/edgedetectiongpu`, use the following
command:

```
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/edgedetectiongpu:EdgeDetectionGpuApp
```

Then, go back to Xcode, open Window > Devices and Simulators, select your
			@ -188,9 +189,9 @@ blank white screen.
## Use the camera for the live view feed

In this tutorial, we will use the `MPPCameraInputSource` class to access and
grab frames from the camera. This class uses the `AVCaptureSession` API to get
the frames from the camera.

But before using this class, change the `Info.plist` file to support camera
usage in the app.

			@ -198,7 +199,7 @@ usage in the app.
In `ViewController.m`, add the following import line:

```
#import "mediapipe/objc/MPPCameraInputSource.h"
```

Add the following to its implementation block to create an object
			@ -207,7 +208,7 @@ Add the following to its implementation block to create an object
```
@implementation ViewController {
  // Handles camera access via AVCaptureSession library.
  MPPCameraInputSource* _cameraSource;
}
```

			@ -217,7 +218,7 @@ Add the following code to `viewDidLoad()`:
 | 
			
		|||
-(void)viewDidLoad {
 | 
			
		||||
  [super viewDidLoad];
 | 
			
		||||
 | 
			
		||||
  _cameraSource = [[MediaPipeCameraInputSource alloc] init];
 | 
			
		||||
  _cameraSource = [[MPPCameraInputSource alloc] init];
 | 
			
		||||
  _cameraSource.sessionPreset = AVCaptureSessionPresetHigh;
 | 
			
		||||
  _cameraSource.cameraPosition = AVCaptureDevicePositionBack;
 | 
			
		||||
  // The frame's native format is rotated with respect to the portrait orientation.
 | 
			
		||||
| 
						 | 
				
			

The code initializes `_cameraSource`, sets the capture session preset, and
selects which camera to use.

We need to get frames from the `_cameraSource` into our application
`ViewController` to display them. `MPPCameraInputSource` is a subclass of
`MPPInputSource`, which provides a protocol for its delegates, namely the
`MPPInputSourceDelegate`. So our application `ViewController` can be a delegate
of `_cameraSource`.

To handle camera setup and process incoming frames, we should use a queue
different from the main queue. Add the following to the implementation block of
the `ViewController`:

```
static const char* kVideoQueueLabel = "com.google.mediapipe.example.videoQueue";
```

Before implementing any method from the `MPPInputSourceDelegate` protocol, we
must first set up a way to display the camera frames. MediaPipe provides another
utility called `MPPLayerRenderer` to display images on the screen. This utility
can be used to display `CVPixelBufferRef` objects, which is the type of the
images provided by `MPPCameraInputSource` to its delegates.

To display images on the screen, we need to add a new `UIView` object called
`_liveView` to the `ViewController`.

Add the following lines to the implementation block of the `ViewController`:

```
// Display the camera preview frames.
IBOutlet UIView* _liveView;
// Render frames in a layer.
MPPLayerRenderer* _renderer;
```
Go to `Main.storyboard`, add a `UIView` object from the object library to the

Go back to `ViewController.m` and add the following code to `viewDidLoad()` to
initialize the `_renderer` object:

```
_renderer = [[MPPLayerRenderer alloc] init];
_renderer.layer.frame = _liveView.layer.bounds;
[_liveView.layer addSublayer:_renderer.layer];
_renderer.frameScaleMode = MediaPipeFrameScaleFillAndCrop;
```

To get frames from the camera, we will implement the following method:
```
// Must be invoked on _videoQueue.
- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
                timestamp:(CMTime)timestamp
               fromSource:(MPPInputSource*)source {
  if (source != _cameraSource) {
    NSLog(@"Unknown source: %@", source);
    return;
  }
}
```

This is a delegate method of `MPPInputSource`. We first check that we are
getting frames from the right source, i.e. the `_cameraSource`. Then we display
the frame received from the camera via `_renderer` on the main queue.

Before we start running the camera, we need the user's permission to access it.
`MPPCameraInputSource` provides a function
`requestCameraAccessWithCompletionHandler:(void (^_Nullable)(BOOL
granted))handler` to request camera access and do some work when the user has
responded. Add the following code to `viewWillAppear:animated`:
Add the following property to the interface of the `ViewController`:

```
// The MediaPipe graph currently in use. Initialized in viewDidLoad, started in viewWillAppear: and
// sent video frames on _videoQueue.
@property(nonatomic) MPPGraph* mediapipeGraph;
```

As explained in the comment above, we will initialize this graph in
using the following function:

```
+ (MPPGraph*)loadGraphFromResource:(NSString*)resource {
  // Load the graph config resource.
  NSError* configLoadError = nil;
  NSBundle* bundle = [NSBundle bundleForClass:[self class]];

  config.ParseFromArray(data.bytes, data.length);

  // Create MediaPipe graph with mediapipe::CalculatorGraphConfig proto object.
  MPPGraph* newGraph = [[MPPGraph alloc] initWithGraphConfig:config];
  [newGraph addFrameOutputStream:kOutputStream outputPacketType:MediaPipePacketPixelBuffer];
  return newGraph;
}
```
this function's implementation to do the following:

```
- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
                timestamp:(CMTime)timestamp
               fromSource:(MPPInputSource*)source {
  if (source != _cameraSource) {
    NSLog(@"Unknown source: %@", source);
    return;
```
The graph will run with this input packet and output a result in
`kOutputStream`. We will implement the following
method to receive packets on this output stream and display them on the screen:

```
- (void)mediapipeGraph:(MPPGraph*)graph
   didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
             fromStream:(const std::string&)streamName {
  if (streamName == kOutputStream) {
    // Display the captured image on the screen.
    CVPixelBufferRetain(pixelBuffer);
```
And that is all! Build and run the app on your iOS device. You should see the
results of running the edge detection graph on a live video feed. Congrats!

{width="300"}

If you ran into any issues, please see the full code of the tutorial
[here](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/edgedetectiongpu).
machine learning pipeline can be built as a graph of modular components,
including, for instance, inference models and media processing functions. Sensory
data such as audio and video streams enter the graph, and perceived descriptions
such as object-localization and face-landmark streams exit the graph. An example
graph that performs real-time hand tracking on mobile GPU is shown below.

.. image:: images/mobile/hand_tracking_mobile.png
   :width: 400
   :alt: Example MediaPipe graph
APIs for MediaPipe

    * (Coming Soon) Graph Construction API in C++
    * Graph Execution API in C++
    * Graph Execution API in Java (Android)
    * Graph Execution API in Objective-C (iOS)

Alpha Disclaimer
==================
MediaPipe is currently in alpha for v0.6. We are still making breaking API changes and expect to get to a stable API by v1.0. We recommend that you target a specific version of MediaPipe, and periodically bump to the latest release. That way you have control over when a breaking change affects you.

User Documentation
==================
#### Graph

{width="800"}

To visualize the graph as shown above, copy the text specification of the graph
below and paste it into [MediaPipe Visualizer](https://viz.mediapipe.dev/).

#### Graph

{width="400"}

To visualize the graph as shown above, copy the text specification of the graph
below and paste it into [MediaPipe Visualizer](https://viz.mediapipe.dev/).
# Object Detection (CPU)

This doc focuses on the
[example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/object_detection/object_detection_mobile_cpu.pbtxt)
that performs object detection with TensorFlow Lite on CPU.

This is very similar to the GPU version of this example,
except that at the beginning and the end of the graph it performs GPU-to-CPU and
CPU-to-GPU image transfer respectively. As a result, the rest of the graph, which
shares the same configuration as the
[GPU graph](images/mobile/object_detection_mobile_gpu.png), runs entirely on
CPU.

{width="300"}

## Android

Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
general instructions to develop an Android application that uses MediaPipe.

The graph below is used in the
[Object Detection CPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectioncpu).
To build the app, run:
```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectioncpu
```

To further install the app on an Android device, run:

```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectioncpu/objectdetectioncpu.apk
```
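Bazel writes the APK under `bazel-bin`, mirroring the target's package path, and names it after the `android_binary` target (which here matches its directory). A small shell sketch of how that install path can be derived; the helper variables are illustrative, not part of MediaPipe's tooling:

```shell
# Derive the APK output path for an android_binary whose name matches its directory.
target="mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectioncpu"
app="${target##*/}"                   # last path component: objectdetectioncpu
apk="bazel-bin/${target}/${app}.apk"  # bazel-bin mirrors the package path
echo "${apk}"
```

This prints the same path that is passed to `adb install` above.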
## iOS

Please see [Hello World! in MediaPipe on iOS](hello_world_ios.md) for general
instructions to develop an iOS application that uses MediaPipe.

The graph below is used in the
[Object Detection CPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/objectdetectioncpu).
To build the app, please see the general
[MediaPipe iOS app building and setup instructions](./mediapipe_ios_setup.md).
Specific to this example, run:
```bash
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/objectdetectioncpu:ObjectDetectionCpuApp
```

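The build command above uses bazel's `package:name` label form: the text before the colon is the package directory, and the text after it is the target (app) name. A quick shell illustration of that split, purely for orientation:

```shell
# Split a bazel label into its package path and target name.
label="mediapipe/examples/ios/objectdetectioncpu:ObjectDetectionCpuApp"
pkg="${label%%:*}"    # package directory
name="${label##*:}"   # target name
echo "${pkg} ${name}"
```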
## Graph

{width="400"}

To visualize the graph as shown above, copy the text specification of the graph
below and paste it into [MediaPipe Visualizer](https://viz.mediapipe.dev/).

[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/object_detection/object_detection_mobile_cpu.pbtxt)

```bash
# MediaPipe graph that performs object detection with TensorFlow Lite on CPU.
# Used in the example in
```
This doc focuses on the
[below example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/object_detection/object_detection_mobile_gpu.pbtxt)
that performs object detection with TensorFlow Lite on GPU.

{width="300"}

## Android

Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
general instructions to develop an Android application that uses MediaPipe.

The graph below is used in the
[Object Detection GPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectiongpu).
To build the app, run:
```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectiongpu
```

To further install the app on an Android device, run:

```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectiongpu/objectdetectiongpu.apk
```
## iOS

Please see [Hello World! in MediaPipe on iOS](hello_world_ios.md) for general
instructions to develop an iOS application that uses MediaPipe.

The graph below is used in the
[Object Detection GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/objectdetectiongpu).
To build the app, please see the general
[MediaPipe iOS app building and setup instructions](./mediapipe_ios_setup.md).
Specific to this example, run:
```bash
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/objectdetectiongpu:ObjectDetectionGpuApp
```

## Graph

{width="400"}

To visualize the graph as shown above, copy the text specification of the graph
below and paste it into [MediaPipe Visualizer](https://viz.mediapipe.dev/).

[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/object_detection/object_detection_mobile_gpu.pbtxt)

```bash
# MediaPipe graph that performs object detection with TensorFlow Lite on GPU.
# Used in the example in
```
and its associated [subgraph](./framework_concepts.md#subgraph) called

*   Click on the subgraph block in purple `Hand Detection` and the
    `hand_detection_gpu.pbtxt` tab will open
    {width="1500"}

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.google.mediapipe.apps.handdetectiongpu">

  <uses-sdk
      android:minSdkVersion="21"
      android:targetSdkVersion="27" />

  <!-- For using the camera -->
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-feature android:name="android.hardware.camera" />
  <uses-feature android:name="android.hardware.camera.autofocus" />
  <!-- For MediaPipe -->
  <uses-feature android:glEsVersion="0x00020000" android:required="true" />

  <application
      android:allowBackup="true"
      android:label="@string/app_name"
      android:supportsRtl="true"
      android:theme="@style/AppTheme">
      <activity
          android:name=".MainActivity"
          android:exported="true"
          android:screenOrientation="portrait">
          <intent-filter>
              <action android:name="android.intent.action.MAIN" />
              <category android:name="android.intent.category.LAUNCHER" />
          </intent-filter>
      </activity>
  </application>

</manifest>
# Copyright 2019 The MediaPipe Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

licenses(["notice"])  # Apache 2.0

package(default_visibility = ["//visibility:private"])

cc_binary(
    name = "libmediapipe_jni.so",
    linkshared = 1,
    linkstatic = 1,
    deps = [
        "//mediapipe/graphs/hand_tracking:detection_mobile_calculators",
        "//mediapipe/java/com/google/mediapipe/framework/jni:mediapipe_framework_jni",
    ],
)

cc_library(
    name = "mediapipe_jni_lib",
    srcs = [":libmediapipe_jni.so"],
    alwayslink = 1,
)

# Maps the binary graph to an alias (e.g., the app name) for convenience so that the alias can be
# easily incorporated into the app via, for example,
# MainActivity.BINARY_GRAPH_NAME = "appname.binarypb".
genrule(
    name = "binary_graph",
    srcs = ["//mediapipe/graphs/hand_tracking:hand_detection_mobile_gpu_binary_graph"],
    outs = ["handdetectiongpu.binarypb"],
    cmd = "cp $< $@",
)
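The `genrule` above simply copies its single input to its single output: in a genrule `cmd` string, `$<` expands to the lone input file and `$@` to the declared output. A plain-shell sketch of that copy-and-rename step; the input file name and contents are stand-ins:

```shell
# Mimic the genrule's `cmd = "cp $< $@"` with stand-in files.
src="hand_detection_mobile_gpu.binarypb"   # plays the role of $<, the single input
out="handdetectiongpu.binarypb"            # plays the role of $@, the declared output
printf 'binary-graph-bytes' > "$src"       # create a stand-in input file
cp "$src" "$out"                           # what the genrule command does
cat "$out"
```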
android_library(
    name = "mediapipe_lib",
    srcs = glob(["*.java"]),
    assets = [
        ":binary_graph",
        "//mediapipe/models:palm_detection.tflite",
        "//mediapipe/models:palm_detection_labelmap.txt",
    ],
    assets_dir = "",
    manifest = "AndroidManifest.xml",
    resource_files = glob(["res/**"]),
    deps = [
        ":mediapipe_jni_lib",
        "//mediapipe/java/com/google/mediapipe/components:android_camerax_helper",
        "//mediapipe/java/com/google/mediapipe/components:android_components",
        "//mediapipe/java/com/google/mediapipe/framework:android_framework",
        "//mediapipe/java/com/google/mediapipe/glutil",
        "//third_party:androidx_appcompat",
        "//third_party:androidx_constraint_layout",
        "//third_party:androidx_legacy_support_v4",
        "//third_party:androidx_material",
        "//third_party:androidx_recyclerview",
        "//third_party:opencv",
        "@androidx_concurrent_futures//jar",
        "@androidx_lifecycle//jar",
        "@com_google_code_findbugs//jar",
        "@com_google_guava_android//jar",
    ],
)

android_binary(
    name = "handdetectiongpu",
    manifest = "AndroidManifest.xml",
    manifest_values = {"applicationId": "com.google.mediapipe.apps.handdetectiongpu"},
    multidex = "native",
    deps = [
        ":mediapipe_lib",
    ],
)
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package com.google.mediapipe.apps.handdetectiongpu;

import android.graphics.SurfaceTexture;
import android.os.Bundle;
import androidx.appcompat.app.AppCompatActivity;
import android.util.Size;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import com.google.mediapipe.components.CameraHelper;
import com.google.mediapipe.components.CameraXPreviewHelper;
import com.google.mediapipe.components.ExternalTextureConverter;
import com.google.mediapipe.components.FrameProcessor;
import com.google.mediapipe.components.PermissionHelper;
import com.google.mediapipe.framework.AndroidAssetUtil;
import com.google.mediapipe.glutil.EglManager;

/** Main activity of MediaPipe example apps. */
public class MainActivity extends AppCompatActivity {
  private static final String TAG = "MainActivity";

  private static final String BINARY_GRAPH_NAME = "handdetectiongpu.binarypb";
  private static final String INPUT_VIDEO_STREAM_NAME = "input_video";
  private static final String OUTPUT_VIDEO_STREAM_NAME = "output_video";
  private static final CameraHelper.CameraFacing CAMERA_FACING = CameraHelper.CameraFacing.FRONT;

  // Flips the camera-preview frames vertically before sending them into FrameProcessor to be
  // processed in a MediaPipe graph, and flips the processed frames back when they are displayed.
  // This is needed because OpenGL represents images assuming the image origin is at the bottom-left
  // corner, whereas MediaPipe in general assumes the image origin is at top-left.
  private static final boolean FLIP_FRAMES_VERTICALLY = true;

  static {
    // Load all native libraries needed by the app.
    System.loadLibrary("mediapipe_jni");
    System.loadLibrary("opencv_java4");
  }

  // {@link SurfaceTexture} where the camera-preview frames can be accessed.
  private SurfaceTexture previewFrameTexture;
  // {@link SurfaceView} that displays the camera-preview frames processed by a MediaPipe graph.
  private SurfaceView previewDisplayView;

  // Creates and manages an {@link EGLContext}.
  private EglManager eglManager;
  // Sends camera-preview frames into a MediaPipe graph for processing, and displays the processed
  // frames onto a {@link Surface}.
  private FrameProcessor processor;
  // Converts the GL_TEXTURE_EXTERNAL_OES texture from Android camera into a regular texture to be
  // consumed by {@link FrameProcessor} and the underlying MediaPipe graph.
  private ExternalTextureConverter converter;

  // Handles camera access via the {@link CameraX} Jetpack support library.
  private CameraXPreviewHelper cameraHelper;

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    previewDisplayView = new SurfaceView(this);
    setupPreviewDisplayView();

    // Initialize asset manager so that MediaPipe native libraries can access the app assets, e.g.,
    // binary graphs.
    AndroidAssetUtil.initializeNativeAssetManager(this);

    eglManager = new EglManager(null);
    processor =
        new FrameProcessor(
            this,
            eglManager.getNativeContext(),
            BINARY_GRAPH_NAME,
            INPUT_VIDEO_STREAM_NAME,
            OUTPUT_VIDEO_STREAM_NAME);
    processor.getVideoSurfaceOutput().setFlipY(FLIP_FRAMES_VERTICALLY);

    PermissionHelper.checkAndRequestCameraPermissions(this);
  }
 | 
			
		||||
 | 
			
		||||
  @Override
 | 
			
		||||
  protected void onResume() {
 | 
			
		||||
    super.onResume();
 | 
			
		||||
    converter = new ExternalTextureConverter(eglManager.getContext());
 | 
			
		||||
    converter.setFlipY(FLIP_FRAMES_VERTICALLY);
 | 
			
		||||
    converter.setConsumer(processor);
 | 
			
		||||
    if (PermissionHelper.cameraPermissionsGranted(this)) {
 | 
			
		||||
      startCamera();
 | 
			
		||||
    }
 | 
			
		||||
  }
 | 
			
		||||
 | 
			
		||||
  @Override
 | 
			
		||||
  protected void onPause() {
 | 
			
		||||
    super.onPause();
 | 
			
		||||
    converter.close();
 | 
			
		||||
  }
 | 
			
		||||
 | 
			
		||||
  @Override
 | 
			
		||||
  public void onRequestPermissionsResult(
 | 
			
		||||
      int requestCode, String[] permissions, int[] grantResults) {
 | 
			
		||||
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
 | 
			
		||||
    PermissionHelper.onRequestPermissionsResult(requestCode, permissions, grantResults);
 | 
			
		||||
  }
 | 
			
		||||
 | 
			
		||||
  private void setupPreviewDisplayView() {
 | 
			
		||||
    previewDisplayView.setVisibility(View.GONE);
 | 
			
		||||
    ViewGroup viewGroup = findViewById(R.id.preview_display_layout);
 | 
			
		||||
    viewGroup.addView(previewDisplayView);
 | 
			
		||||
 | 
			
		||||
    previewDisplayView
 | 
			
		||||
        .getHolder()
 | 
			
		||||
        .addCallback(
 | 
			
		||||
            new SurfaceHolder.Callback() {
 | 
			
		||||
              @Override
 | 
			
		||||
              public void surfaceCreated(SurfaceHolder holder) {
 | 
			
		||||
                processor.getVideoSurfaceOutput().setSurface(holder.getSurface());
 | 
			
		||||
              }
 | 
			
		||||
 | 
			
		||||
              @Override
 | 
			
		||||
              public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
 | 
			
		||||
                // (Re-)Compute the ideal size of the camera-preview display (the area that the
 | 
			
		||||
                // camera-preview frames get rendered onto, potentially with scaling and rotation)
 | 
			
		||||
                // based on the size of the SurfaceView that contains the display.
 | 
			
		||||
                Size viewSize = new Size(width, height);
 | 
			
		||||
                Size displaySize = cameraHelper.computeDisplaySizeFromViewSize(viewSize);
 | 
			
		||||
 | 
			
		||||
                // Connect the converter to the camera-preview frames as its input (via
 | 
			
		||||
                // previewFrameTexture), and configure the output width and height as the computed
 | 
			
		||||
                // display size.
 | 
			
		||||
                converter.setSurfaceTextureAndAttachToGLContext(
 | 
			
		||||
                    previewFrameTexture, displaySize.getWidth(), displaySize.getHeight());
 | 
			
		||||
              }
 | 
			
		||||
 | 
			
		||||
              @Override
 | 
			
		||||
              public void surfaceDestroyed(SurfaceHolder holder) {
 | 
			
		||||
                processor.getVideoSurfaceOutput().setSurface(null);
 | 
			
		||||
              }
 | 
			
		||||
            });
 | 
			
		||||
  }
 | 
			
		||||
 | 
			
		||||
  private void startCamera() {
 | 
			
		||||
    cameraHelper = new CameraXPreviewHelper();
 | 
			
		||||
    cameraHelper.setOnCameraStartedListener(
 | 
			
		||||
        surfaceTexture -> {
 | 
			
		||||
          previewFrameTexture = surfaceTexture;
 | 
			
		||||
          // Make the display view visible to start showing the preview. This triggers the
 | 
			
		||||
          // SurfaceHolder.Callback added to (the holder of) previewDisplayView.
 | 
			
		||||
          previewDisplayView.setVisibility(View.VISIBLE);
 | 
			
		||||
        });
 | 
			
		||||
    cameraHelper.startCamera(this, CAMERA_FACING, /*surfaceTexture=*/ null);
 | 
			
		||||
  }
 | 
			
		||||
}
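The FLIP_FRAMES_VERTICALLY constant above compensates for a coordinate-convention mismatch: OpenGL places the image origin at the bottom-left corner, while MediaPipe assumes it is at the top-left. As a minimal standalone illustration of that flip (plain Java written for this note, not MediaPipe code; the class and method names are made up):

```java
/**
 * Illustration of the vertical flip implied by FLIP_FRAMES_VERTICALLY: mapping a
 * row index between a bottom-left-origin image (OpenGL) and a top-left-origin
 * image (MediaPipe). The flip is its own inverse, which is why flipping frames
 * on the way into the graph and again on display round-trips cleanly.
 */
public class FlipExample {
  // Map a row index to the opposite vertical origin for an image of the given height.
  static int flipRow(int row, int height) {
    return height - 1 - row;
  }

  public static void main(String[] args) {
    int height = 480;
    // Row 0 in the bottom-left convention is the last row in the top-left convention.
    System.out.println(flipRow(0, height));                       // 479
    // Applying the flip twice restores the original index.
    System.out.println(flipRow(flipRow(123, height), height));    // 123
  }
}
```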
@@ -0,0 +1,20 @@
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <FrameLayout
        android:id="@+id/preview_display_layout"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:layout_weight="1">
        <TextView
            android:id="@+id/no_camera_access_view"
            android:layout_height="fill_parent"
            android:layout_width="fill_parent"
            android:gravity="center"
            android:text="@string/no_camera_access" />
    </FrameLayout>
</androidx.constraintlayout.widget.ConstraintLayout>
@@ -0,0 +1,6 @@
<?xml version="1.0" encoding="utf-8"?>
<resources>
    <color name="colorPrimary">#008577</color>
    <color name="colorPrimaryDark">#00574B</color>
    <color name="colorAccent">#D81B60</color>
</resources>
@@ -0,0 +1,4 @@
<resources>
    <string name="app_name" translatable="false">Hand Detection GPU</string>
    <string name="no_camera_access" translatable="false">Please grant camera permissions.</string>
</resources>
@@ -0,0 +1,11 @@
<resources>

    <!-- Base application theme. -->
    <style name="AppTheme" parent="Theme.AppCompat.Light.DarkActionBar">
        <!-- Customize your theme here. -->
        <item name="colorPrimary">@color/colorPrimary</item>
        <item name="colorPrimaryDark">@color/colorPrimaryDark</item>
        <item name="colorAccent">@color/colorAccent</item>
    </style>

</resources>
@@ -0,0 +1,33 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.google.mediapipe.apps.handtrackinggpu">

  <uses-sdk
      android:minSdkVersion="21"
      android:targetSdkVersion="27" />

  <!-- For using the camera -->
  <uses-permission android:name="android.permission.CAMERA" />
  <uses-feature android:name="android.hardware.camera" />
  <uses-feature android:name="android.hardware.camera.autofocus" />
  <!-- For MediaPipe -->
  <uses-feature android:glEsVersion="0x00020000" android:required="true" />


  <application
      android:allowBackup="true"
      android:label="@string/app_name"
      android:supportsRtl="true"
      android:theme="@style/AppTheme">
      <activity
          android:name=".MainActivity"
          android:exported="true"
          android:screenOrientation="portrait">
          <intent-filter>
              <action android:name="android.intent.action.MAIN" />
              <category android:name="android.intent.category.LAUNCHER" />
          </intent-filter>
      </activity>
  </application>

</manifest>
@@ -0,0 +1,103 @@
# Copyright 2019 The MediaPipe Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

licenses(["notice"])  # Apache 2.0

package(default_visibility = ["//visibility:private"])

cc_binary(
    name = "libmediapipe_jni.so",
    linkshared = 1,
    linkstatic = 1,
    deps = [
        "//mediapipe/graphs/hand_tracking:mobile_calculators",
        "//mediapipe/java/com/google/mediapipe/framework/jni:mediapipe_framework_jni",
    ],
)

cc_library(
    name = "mediapipe_jni_lib",
    srcs = [":libmediapipe_jni.so"],
    alwayslink = 1,
)

# Maps the binary graph to an alias (e.g., the app name) for convenience so that the alias can be
# easily incorporated into the app via, for example,
# MainActivity.BINARY_GRAPH_NAME = "appname.binarypb".
genrule(
    name = "binary_graph",
    srcs = ["//mediapipe/graphs/hand_tracking:hand_tracking_mobile_gpu_binary_graph"],
    outs = ["handtrackinggpu.binarypb"],
    cmd = "cp $< $@",
)

# To use the 3D model instead of the default 2D model, add "--define 3D=true" to the
# bazel build command.
config_setting(
    name = "use_3d_model",
    define_values = {
        "3D": "true",
    },
)

genrule(
    name = "model",
    srcs = select({
        "//conditions:default": ["//mediapipe/models:hand_landmark.tflite"],
        ":use_3d_model": ["//mediapipe/models:hand_landmark_3d.tflite"],
    }),
    outs = ["hand_landmark.tflite"],
    cmd = "cp $< $@",
)

android_library(
    name = "mediapipe_lib",
    srcs = glob(["*.java"]),
    assets = [
        ":binary_graph",
        ":model",
        "//mediapipe/models:palm_detection.tflite",
        "//mediapipe/models:palm_detection_labelmap.txt",
    ],
    assets_dir = "",
    manifest = "AndroidManifest.xml",
    resource_files = glob(["res/**"]),
    deps = [
        ":mediapipe_jni_lib",
        "//mediapipe/java/com/google/mediapipe/components:android_camerax_helper",
        "//mediapipe/java/com/google/mediapipe/components:android_components",
        "//mediapipe/java/com/google/mediapipe/framework:android_framework",
        "//mediapipe/java/com/google/mediapipe/glutil",
        "//third_party:androidx_appcompat",
        "//third_party:androidx_constraint_layout",
        "//third_party:androidx_legacy_support_v4",
        "//third_party:androidx_material",
        "//third_party:androidx_recyclerview",
        "//third_party:opencv",
        "@androidx_concurrent_futures//jar",
        "@androidx_lifecycle//jar",
        "@com_google_code_findbugs//jar",
        "@com_google_guava_android//jar",
    ],
)

android_binary(
    name = "handtrackinggpu",
    manifest = "AndroidManifest.xml",
    manifest_values = {"applicationId": "com.google.mediapipe.apps.handtrackinggpu"},
    multidex = "native",
    deps = [
        ":mediapipe_lib",
    ],
)
@@ -0,0 +1,167 @@
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package com.google.mediapipe.apps.handtrackinggpu;

import android.graphics.SurfaceTexture;
import android.os.Bundle;
import androidx.appcompat.app.AppCompatActivity;
import android.util.Size;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import com.google.mediapipe.components.CameraHelper;
import com.google.mediapipe.components.CameraXPreviewHelper;
import com.google.mediapipe.components.ExternalTextureConverter;
import com.google.mediapipe.components.FrameProcessor;
import com.google.mediapipe.components.PermissionHelper;
import com.google.mediapipe.framework.AndroidAssetUtil;
import com.google.mediapipe.glutil.EglManager;

/** Main activity of MediaPipe example apps. */
public class MainActivity extends AppCompatActivity {
  private static final String TAG = "MainActivity";

  private static final String BINARY_GRAPH_NAME = "handtrackinggpu.binarypb";
  private static final String INPUT_VIDEO_STREAM_NAME = "input_video";
  private static final String OUTPUT_VIDEO_STREAM_NAME = "output_video";
  private static final CameraHelper.CameraFacing CAMERA_FACING = CameraHelper.CameraFacing.FRONT;

  // Flips the camera-preview frames vertically before sending them into FrameProcessor to be
  // processed in a MediaPipe graph, and flips the processed frames back when they are displayed.
  // This is needed because OpenGL represents images assuming the image origin is at the bottom-left
  // corner, whereas MediaPipe in general assumes the image origin is at top-left.
  private static final boolean FLIP_FRAMES_VERTICALLY = true;

  static {
    // Load all native libraries needed by the app.
    System.loadLibrary("mediapipe_jni");
    System.loadLibrary("opencv_java4");
  }

  // {@link SurfaceTexture} where the camera-preview frames can be accessed.
  private SurfaceTexture previewFrameTexture;
  // {@link SurfaceView} that displays the camera-preview frames processed by a MediaPipe graph.
  private SurfaceView previewDisplayView;

  // Creates and manages an {@link EGLContext}.
  private EglManager eglManager;
  // Sends camera-preview frames into a MediaPipe graph for processing, and displays the processed
  // frames onto a {@link Surface}.
  private FrameProcessor processor;
  // Converts the GL_TEXTURE_EXTERNAL_OES texture from Android camera into a regular texture to be
  // consumed by {@link FrameProcessor} and the underlying MediaPipe graph.
  private ExternalTextureConverter converter;

  // Handles camera access via the {@link CameraX} Jetpack support library.
  private CameraXPreviewHelper cameraHelper;

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);

    previewDisplayView = new SurfaceView(this);
    setupPreviewDisplayView();

    // Initialize asset manager so that MediaPipe native libraries can access the app assets, e.g.,
    // binary graphs.
    AndroidAssetUtil.initializeNativeAssetManager(this);

    eglManager = new EglManager(null);
    processor =
        new FrameProcessor(
            this,
            eglManager.getNativeContext(),
            BINARY_GRAPH_NAME,
            INPUT_VIDEO_STREAM_NAME,
            OUTPUT_VIDEO_STREAM_NAME);
    processor.getVideoSurfaceOutput().setFlipY(FLIP_FRAMES_VERTICALLY);

    PermissionHelper.checkAndRequestCameraPermissions(this);
  }

  @Override
  protected void onResume() {
    super.onResume();
    converter = new ExternalTextureConverter(eglManager.getContext());
    converter.setFlipY(FLIP_FRAMES_VERTICALLY);
    converter.setConsumer(processor);
    if (PermissionHelper.cameraPermissionsGranted(this)) {
      startCamera();
    }
  }

  @Override
  protected void onPause() {
    super.onPause();
    converter.close();
  }

  @Override
  public void onRequestPermissionsResult(
      int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    PermissionHelper.onRequestPermissionsResult(requestCode, permissions, grantResults);
  }

  private void setupPreviewDisplayView() {
    previewDisplayView.setVisibility(View.GONE);
    ViewGroup viewGroup = findViewById(R.id.preview_display_layout);
    viewGroup.addView(previewDisplayView);

    previewDisplayView
        .getHolder()
        .addCallback(
            new SurfaceHolder.Callback() {
              @Override
              public void surfaceCreated(SurfaceHolder holder) {
                processor.getVideoSurfaceOutput().setSurface(holder.getSurface());
              }

              @Override
              public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
                // (Re-)Compute the ideal size of the camera-preview display (the area that the
                // camera-preview frames get rendered onto, potentially with scaling and rotation)
                // based on the size of the SurfaceView that contains the display.
                Size viewSize = new Size(width, height);
                Size displaySize = cameraHelper.computeDisplaySizeFromViewSize(viewSize);

                // Connect the converter to the camera-preview frames as its input (via
                // previewFrameTexture), and configure the output width and height as the computed
                // display size.
                converter.setSurfaceTextureAndAttachToGLContext(
                    previewFrameTexture, displaySize.getWidth(), displaySize.getHeight());
              }

              @Override
              public void surfaceDestroyed(SurfaceHolder holder) {
                processor.getVideoSurfaceOutput().setSurface(null);
              }
            });
  }

  private void startCamera() {
    cameraHelper = new CameraXPreviewHelper();
    cameraHelper.setOnCameraStartedListener(
        surfaceTexture -> {
          previewFrameTexture = surfaceTexture;
          // Make the display view visible to start showing the preview. This triggers the
          // SurfaceHolder.Callback added to (the holder of) previewDisplayView.
          previewDisplayView.setVisibility(View.VISIBLE);
        });
    cameraHelper.startCamera(this, CAMERA_FACING, /*surfaceTexture=*/ null);
  }
}
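In surfaceChanged above, cameraHelper.computeDisplaySizeFromViewSize picks the preview display size from the SurfaceView size. The helper's actual implementation is not part of this diff and may differ; as a hedged sketch of the general idea, an aspect-ratio-preserving fit of a camera frame into a view could look like this (class and method names here are hypothetical, invented for the example):

```java
/**
 * Hypothetical sketch in the spirit of computeDisplaySizeFromViewSize: scale a
 * camera frame to fit inside a view without distorting its aspect ratio. The
 * real CameraXPreviewHelper logic is not shown in this diff and may differ.
 */
public class DisplaySizeExample {
  // Scale (frameWidth x frameHeight) to fit inside (viewWidth x viewHeight)
  // while preserving aspect ratio; returns {width, height}.
  static int[] fitPreservingAspect(int frameWidth, int frameHeight,
                                   int viewWidth, int viewHeight) {
    double scale = Math.min((double) viewWidth / frameWidth,
                            (double) viewHeight / frameHeight);
    return new int[] {
      (int) Math.round(frameWidth * scale),
      (int) Math.round(frameHeight * scale)
    };
  }

  public static void main(String[] args) {
    // A 640x480 camera frame in a 1080x1920 portrait view is limited by width.
    int[] size = fitPreservingAspect(640, 480, 1080, 1920);
    System.out.println(size[0] + "x" + size[1]);  // 1080x810
  }
}
```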
@@ -0,0 +1,20 @@
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <FrameLayout
        android:id="@+id/preview_display_layout"
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:layout_weight="1">
        <TextView
            android:id="@+id/no_camera_access_view"
            android:layout_height="fill_parent"
            android:layout_width="fill_parent"
            android:gravity="center"
            android:text="@string/no_camera_access" />
    </FrameLayout>
</androidx.constraintlayout.widget.ConstraintLayout>
@@ -0,0 +1,6 @@
<?xml version="1.0" encoding="utf-8"?>
<resources>
    <color name="colorPrimary">#008577</color>
    <color name="colorPrimaryDark">#00574B</color>
    <color name="colorAccent">#D81B60</color>
</resources>
@@ -0,0 +1,4 @@
<resources>
    <string name="app_name" translatable="false">Hand Tracking GPU</string>
    <string name="no_camera_access" translatable="false">Please grant camera permissions.</string>
</resources>
@@ -0,0 +1,11 @@
<resources>

    <!-- Base application theme. -->
    <style name="AppTheme" parent="Theme.AppCompat.Light.DarkActionBar">
        <!-- Customize your theme here. -->
        <item name="colorPrimary">@color/colorPrimary</item>
        <item name="colorPrimaryDark">@color/colorPrimaryDark</item>
        <item name="colorAccent">@color/colorAccent</item>
    </style>

</resources>
mediapipe/examples/ios/handdetectiongpu/AppDelegate.h (new file, 21 lines)
@@ -0,0 +1,21 @@
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#import <UIKit/UIKit.h>

@interface AppDelegate : UIResponder <UIApplicationDelegate>

@property(strong, nonatomic) UIWindow *window;

@end
mediapipe/examples/ios/handdetectiongpu/AppDelegate.m (new file, 59 lines)
@@ -0,0 +1,59 @@
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#import "AppDelegate.h"

@interface AppDelegate ()

@end

@implementation AppDelegate

- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
  // Override point for customization after application launch.
  return YES;
}

- (void)applicationWillResignActive:(UIApplication *)application {
  // Sent when the application is about to move from active to inactive state. This can occur for
  // certain types of temporary interruptions (such as an incoming phone call or SMS message) or
  // when the user quits the application and it begins the transition to the background state. Use
  // this method to pause ongoing tasks, disable timers, and invalidate graphics rendering
  // callbacks. Games should use this method to pause the game.
}

- (void)applicationDidEnterBackground:(UIApplication *)application {
  // Use this method to release shared resources, save user data, invalidate timers, and store
  // enough application state information to restore your application to its current state in case
  // it is terminated later. If your application supports background execution, this method is
  // called instead of applicationWillTerminate: when the user quits.
}

- (void)applicationWillEnterForeground:(UIApplication *)application {
  // Called as part of the transition from the background to the active state; here you can undo
  // many of the changes made on entering the background.
}

- (void)applicationDidBecomeActive:(UIApplication *)application {
  // Restart any tasks that were paused (or not yet started) while the application was inactive. If
  // the application was previously in the background, optionally refresh the user interface.
}

- (void)applicationWillTerminate:(UIApplication *)application {
  // Called when the application is about to terminate. Save data if appropriate. See also
  // applicationDidEnterBackground:.
}

@end
			@ -0,0 +1,99 @@
 | 
			
		|||
{
  "images" : [
    {
      "idiom" : "iphone",
      "size" : "20x20",
      "scale" : "2x"
    },
    {
      "idiom" : "iphone",
      "size" : "20x20",
      "scale" : "3x"
    },
    {
      "idiom" : "iphone",
      "size" : "29x29",
      "scale" : "2x"
    },
    {
      "idiom" : "iphone",
      "size" : "29x29",
      "scale" : "3x"
    },
    {
      "idiom" : "iphone",
      "size" : "40x40",
      "scale" : "2x"
    },
    {
      "idiom" : "iphone",
      "size" : "40x40",
      "scale" : "3x"
    },
    {
      "idiom" : "iphone",
      "size" : "60x60",
      "scale" : "2x"
    },
    {
      "idiom" : "iphone",
      "size" : "60x60",
      "scale" : "3x"
    },
    {
      "idiom" : "ipad",
      "size" : "20x20",
      "scale" : "1x"
    },
    {
      "idiom" : "ipad",
      "size" : "20x20",
      "scale" : "2x"
    },
    {
      "idiom" : "ipad",
      "size" : "29x29",
      "scale" : "1x"
    },
    {
      "idiom" : "ipad",
      "size" : "29x29",
      "scale" : "2x"
    },
    {
      "idiom" : "ipad",
      "size" : "40x40",
      "scale" : "1x"
    },
    {
      "idiom" : "ipad",
      "size" : "40x40",
      "scale" : "2x"
    },
    {
      "idiom" : "ipad",
      "size" : "76x76",
      "scale" : "1x"
    },
    {
      "idiom" : "ipad",
      "size" : "76x76",
      "scale" : "2x"
    },
    {
      "idiom" : "ipad",
      "size" : "83.5x83.5",
      "scale" : "2x"
    },
    {
      "idiom" : "ios-marketing",
      "size" : "1024x1024",
      "scale" : "1x"
    }
  ],
  "info" : {
    "version" : 1,
    "author" : "xcode"
  }
}
@@ -0,0 +1,7 @@
{
  "info" : {
    "version" : 1,
    "author" : "xcode"
  }
}
mediapipe/examples/ios/handdetectiongpu/BUILD (new file, 75 lines)
@@ -0,0 +1,75 @@
# Copyright 2019 The MediaPipe Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

licenses(["notice"])  # Apache 2.0

MIN_IOS_VERSION = "10.0"

load(
    "@build_bazel_rules_apple//apple:ios.bzl",
    "ios_application",
)

ios_application(
    name = "HandDetectionGpuApp",
    bundle_id = "com.google.mediapipe.HandDetectionGpu",
    families = [
        "iphone",
        "ipad",
    ],
    infoplists = ["Info.plist"],
    minimum_os_version = MIN_IOS_VERSION,
    provisioning_profile = "//mediapipe/examples/ios:provisioning_profile",
    deps = [
        ":HandDetectionGpuAppLibrary",
        "@ios_opencv//:OpencvFramework",
    ],
)

objc_library(
    name = "HandDetectionGpuAppLibrary",
    srcs = [
        "AppDelegate.m",
        "ViewController.mm",
        "main.m",
    ],
    hdrs = [
        "AppDelegate.h",
        "ViewController.h",
    ],
    data = [
        "Base.lproj/LaunchScreen.storyboard",
        "Base.lproj/Main.storyboard",
        "//mediapipe/graphs/hand_tracking:hand_detection_mobile_gpu_binary_graph",
        "//mediapipe/models:palm_detection.tflite",
        "//mediapipe/models:palm_detection_labelmap.txt",
    ],
    sdk_frameworks = [
        "AVFoundation",
        "CoreGraphics",
        "CoreMedia",
        "UIKit",
    ],
    deps = [
        "//mediapipe/objc:mediapipe_framework_ios",
        "//mediapipe/objc:mediapipe_input_sources_ios",
        "//mediapipe/objc:mediapipe_layer_renderer",
    ] + select({
        "//mediapipe:ios_i386": [],
        "//mediapipe:ios_x86_64": [],
        "//conditions:default": [
            "//mediapipe/graphs/hand_tracking:detection_mobile_calculators",
        ],
    }),
)
@@ -0,0 +1,25 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<document type="com.apple.InterfaceBuilder3.CocoaTouch.Storyboard.XIB" version="3.0" toolsVersion="13122.16" targetRuntime="iOS.CocoaTouch" propertyAccessControl="none" useAutolayout="YES" launchScreen="YES" useTraitCollections="YES" useSafeAreas="YES" colorMatched="YES" initialViewController="01J-lp-oVM">
    <dependencies>
        <plugIn identifier="com.apple.InterfaceBuilder.IBCocoaTouchPlugin" version="13104.12"/>
        <capability name="Safe area layout guides" minToolsVersion="9.0"/>
        <capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
    </dependencies>
    <scenes>
        <!--View Controller-->
        <scene sceneID="EHf-IW-A2E">
            <objects>
                <viewController id="01J-lp-oVM" sceneMemberID="viewController">
                    <view key="view" contentMode="scaleToFill" id="Ze5-6b-2t3">
                        <rect key="frame" x="0.0" y="0.0" width="375" height="667"/>
                        <autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
                        <color key="backgroundColor" red="1" green="1" blue="1" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
                        <viewLayoutGuide key="safeArea" id="6Tk-OE-BBY"/>
                    </view>
                </viewController>
                <placeholder placeholderIdentifier="IBFirstResponder" id="iYj-Kq-Ea1" userLabel="First Responder" sceneMemberID="firstResponder"/>
            </objects>
            <point key="canvasLocation" x="53" y="375"/>
        </scene>
    </scenes>
</document>
@@ -0,0 +1,41 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.CocoaTouch.Storyboard.XIB" version="3.0" toolsVersion="14490.70" targetRuntime="iOS.CocoaTouch" propertyAccessControl="none" useAutolayout="YES" useTraitCollections="YES" useSafeAreas="YES" colorMatched="YES" initialViewController="BYZ-38-t0r">
    <device id="retina4_7" orientation="portrait">
        <adaptation id="fullscreen"/>
    </device>
    <dependencies>
        <plugIn identifier="com.apple.InterfaceBuilder.IBCocoaTouchPlugin" version="14490.49"/>
        <capability name="Safe area layout guides" minToolsVersion="9.0"/>
        <capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
    </dependencies>
    <scenes>
        <!--View Controller-->
        <scene sceneID="tne-QT-ifu">
            <objects>
                <viewController id="BYZ-38-t0r" customClass="ViewController" sceneMemberID="viewController">
                    <view key="view" contentMode="scaleToFill" id="8bC-Xf-vdC">
                        <rect key="frame" x="0.0" y="0.0" width="375" height="667"/>
                        <autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
                        <subviews>
                            <view contentMode="scaleToFill" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="EfB-xq-knP">
                                <rect key="frame" x="0.0" y="20" width="375" height="647"/>
                                <autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
                                <color key="backgroundColor" white="0.0" alpha="1" colorSpace="custom" customColorSpace="genericGamma22GrayColorSpace"/>
                                <accessibility key="accessibilityConfiguration" label="PreviewDisplayView">
                                    <bool key="isElement" value="YES"/>
                                </accessibility>
                            </view>
                        </subviews>
                        <color key="backgroundColor" red="1" green="1" blue="1" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
                        <viewLayoutGuide key="safeArea" id="6Tk-OE-BBY"/>
                    </view>
                    <connections>
                        <outlet property="_liveView" destination="EfB-xq-knP" id="wac-VF-etz"/>
                    </connections>
                </viewController>
                <placeholder placeholderIdentifier="IBFirstResponder" id="dkx-z0-nzr" sceneMemberID="firstResponder"/>
            </objects>
            <point key="canvasLocation" x="48.799999999999997" y="20.239880059970016"/>
        </scene>
    </scenes>
</document>
mediapipe/examples/ios/handdetectiongpu/Info.plist (new file, 42 lines)
@@ -0,0 +1,42 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>NSCameraUsageDescription</key>
  <string>This app uses the camera to demonstrate live video processing.</string>
  <key>CFBundleDevelopmentRegion</key>
  <string>en</string>
  <key>CFBundleExecutable</key>
  <string>$(EXECUTABLE_NAME)</string>
  <key>CFBundleIdentifier</key>
  <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
  <key>CFBundleInfoDictionaryVersion</key>
  <string>6.0</string>
  <key>CFBundleName</key>
  <string>$(PRODUCT_NAME)</string>
  <key>CFBundlePackageType</key>
  <string>APPL</string>
  <key>CFBundleShortVersionString</key>
  <string>1.0</string>
  <key>CFBundleVersion</key>
  <string>1</string>
  <key>LSRequiresIPhoneOS</key>
  <true/>
  <key>UILaunchStoryboardName</key>
  <string>LaunchScreen</string>
  <key>UIMainStoryboardFile</key>
  <string>Main</string>
  <key>UIRequiredDeviceCapabilities</key>
  <array>
    <string>armv7</string>
  </array>
  <key>UISupportedInterfaceOrientations</key>
  <array>
    <string>UIInterfaceOrientationPortrait</string>
  </array>
  <key>UISupportedInterfaceOrientations~ipad</key>
  <array>
    <string>UIInterfaceOrientationPortrait</string>
  </array>
</dict>
</plist>
mediapipe/examples/ios/handdetectiongpu/ViewController.h (new file, 19 lines)
@@ -0,0 +1,19 @@
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#import <UIKit/UIKit.h>

@interface ViewController : UIViewController

@end

mediapipe/examples/ios/handdetectiongpu/ViewController.mm (new file, 178 lines)
@@ -0,0 +1,178 @@
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#import "ViewController.h"

#import "mediapipe/objc/MPPGraph.h"
#import "mediapipe/objc/MPPCameraInputSource.h"
#import "mediapipe/objc/MPPLayerRenderer.h"

static NSString* const kGraphName = @"hand_detection_mobile_gpu";

static const char* kInputStream = "input_video";
static const char* kOutputStream = "output_video";
static const char* kVideoQueueLabel = "com.google.mediapipe.example.videoQueue";

@interface ViewController () <MPPGraphDelegate, MPPInputSourceDelegate>

// The MediaPipe graph currently in use. Initialized in viewDidLoad, started in viewWillAppear: and
// sent video frames on _videoQueue.
@property(nonatomic) MPPGraph* mediapipeGraph;

@end

@implementation ViewController {
  /// Handles camera access via AVCaptureSession library.
  MPPCameraInputSource* _cameraSource;

  /// Inform the user when camera is unavailable.
  IBOutlet UILabel* _noCameraLabel;
  /// Display the camera preview frames.
  IBOutlet UIView* _liveView;
  /// Render frames in a layer.
  MPPLayerRenderer* _renderer;

  /// Process camera frames on this queue.
  dispatch_queue_t _videoQueue;
}

#pragma mark - Cleanup methods

- (void)dealloc {
  self.mediapipeGraph.delegate = nil;
  [self.mediapipeGraph cancel];
  // Ignore errors since we're cleaning up.
  [self.mediapipeGraph closeAllInputStreamsWithError:nil];
  [self.mediapipeGraph waitUntilDoneWithError:nil];
}

#pragma mark - MediaPipe graph methods

+ (MPPGraph*)loadGraphFromResource:(NSString*)resource {
  // Load the graph config resource.
  NSError* configLoadError = nil;
  NSBundle* bundle = [NSBundle bundleForClass:[self class]];
  if (!resource || resource.length == 0) {
    return nil;
  }
  NSURL* graphURL = [bundle URLForResource:resource withExtension:@"binarypb"];
  NSData* data = [NSData dataWithContentsOfURL:graphURL options:0 error:&configLoadError];
  if (!data) {
    NSLog(@"Failed to load MediaPipe graph config: %@", configLoadError);
    return nil;
  }

  // Parse the graph config resource into mediapipe::CalculatorGraphConfig proto object.
  mediapipe::CalculatorGraphConfig config;
  config.ParseFromArray(data.bytes, data.length);

  // Create MediaPipe graph with mediapipe::CalculatorGraphConfig proto object.
  MPPGraph* newGraph = [[MPPGraph alloc] initWithGraphConfig:config];
  [newGraph addFrameOutputStream:kOutputStream outputPacketType:MediaPipePacketPixelBuffer];
  return newGraph;
}

#pragma mark - UIViewController methods

- (void)viewDidLoad {
  [super viewDidLoad];

  _renderer = [[MPPLayerRenderer alloc] init];
  _renderer.layer.frame = _liveView.layer.bounds;
  [_liveView.layer addSublayer:_renderer.layer];
  _renderer.frameScaleMode = MediaPipeFrameScaleFillAndCrop;
  // When using the front camera, mirror the input for a more natural look.
  _renderer.mirrored = YES;

  dispatch_queue_attr_t qosAttribute = dispatch_queue_attr_make_with_qos_class(
      DISPATCH_QUEUE_SERIAL, QOS_CLASS_USER_INTERACTIVE, /*relative_priority=*/0);
  _videoQueue = dispatch_queue_create(kVideoQueueLabel, qosAttribute);

  _cameraSource = [[MPPCameraInputSource alloc] init];
  [_cameraSource setDelegate:self queue:_videoQueue];
  _cameraSource.sessionPreset = AVCaptureSessionPresetHigh;
  _cameraSource.cameraPosition = AVCaptureDevicePositionFront;
  // The frame's native format is rotated with respect to the portrait orientation.
  _cameraSource.orientation = AVCaptureVideoOrientationPortrait;

  self.mediapipeGraph = [[self class] loadGraphFromResource:kGraphName];
  self.mediapipeGraph.delegate = self;
  // Set maxFramesInFlight to a small value to avoid memory contention for real-time processing.
  self.mediapipeGraph.maxFramesInFlight = 2;
}

// In this application, there is only one ViewController which has no navigation to other view
// controllers, and there is only one View with live display showing the result of running the
// MediaPipe graph on the live video feed. If more view controllers are needed later, the graph
// setup/teardown and camera start/stop logic should be updated appropriately in response to the
// appearance/disappearance of this ViewController, as viewWillAppear: can be invoked multiple times
// depending on the application navigation flow in that case.
- (void)viewWillAppear:(BOOL)animated {
  [super viewWillAppear:animated];

  [_cameraSource requestCameraAccessWithCompletionHandler:^void(BOOL granted) {
    if (granted) {
      [self startGraphAndCamera];
      dispatch_async(dispatch_get_main_queue(), ^{
        _noCameraLabel.hidden = YES;
      });
    }
  }];
}

- (void)startGraphAndCamera {
  // Start running self.mediapipeGraph.
  NSError* error;
  if (![self.mediapipeGraph startWithError:&error]) {
    NSLog(@"Failed to start graph: %@", error);
  }

  // Start fetching frames from the camera.
  dispatch_async(_videoQueue, ^{
    [_cameraSource start];
  });
}

#pragma mark - MPPGraphDelegate methods

// Receives CVPixelBufferRef from the MediaPipe graph. Invoked on a MediaPipe worker thread.
- (void)mediapipeGraph:(MPPGraph*)graph
    didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
              fromStream:(const std::string&)streamName {
  if (streamName == kOutputStream) {
    // Display the captured image on the screen.
    CVPixelBufferRetain(pixelBuffer);
    dispatch_async(dispatch_get_main_queue(), ^{
      [_renderer renderPixelBuffer:pixelBuffer];
      CVPixelBufferRelease(pixelBuffer);
    });
  }
}

#pragma mark - MPPInputSourceDelegate methods

// Must be invoked on _videoQueue.
- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
                timestamp:(CMTime)timestamp
               fromSource:(MPPInputSource*)source {
  if (source != _cameraSource) {
    NSLog(@"Unknown source: %@", source);
    return;
  }
  [self.mediapipeGraph sendPixelBuffer:imageBuffer
                            intoStream:kInputStream
                            packetType:MediaPipePacketPixelBuffer];
}

@end
mediapipe/examples/ios/handdetectiongpu/main.m (new file, 22 lines)
@@ -0,0 +1,22 @@
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#import <UIKit/UIKit.h>
#import "AppDelegate.h"

int main(int argc, char * argv[]) {
  @autoreleasepool {
    return UIApplicationMain(argc, argv, nil, NSStringFromClass([AppDelegate class]));
  }
}
mediapipe/examples/ios/handtrackinggpu/AppDelegate.h (new file, 21 lines)
@@ -0,0 +1,21 @@
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#import <UIKit/UIKit.h>

@interface AppDelegate : UIResponder <UIApplicationDelegate>

@property(strong, nonatomic) UIWindow *window;

@end
mediapipe/examples/ios/handtrackinggpu/AppDelegate.m (new file, 59 lines)
@@ -0,0 +1,59 @@
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#import "AppDelegate.h"

@interface AppDelegate ()

@end

@implementation AppDelegate

- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
  // Override point for customization after application launch.
  return YES;
}

- (void)applicationWillResignActive:(UIApplication *)application {
  // Sent when the application is about to move from active to inactive state. This can occur for
  // certain types of temporary interruptions (such as an incoming phone call or SMS message) or
  // when the user quits the application and it begins the transition to the background state. Use
  // this method to pause ongoing tasks, disable timers, and invalidate graphics rendering
  // callbacks. Games should use this method to pause the game.
}

- (void)applicationDidEnterBackground:(UIApplication *)application {
  // Use this method to release shared resources, save user data, invalidate timers, and store
  // enough application state information to restore your application to its current state in case
  // it is terminated later. If your application supports background execution, this method is
  // called instead of applicationWillTerminate: when the user quits.
}

- (void)applicationWillEnterForeground:(UIApplication *)application {
  // Called as part of the transition from the background to the active state; here you can undo
  // many of the changes made on entering the background.
}

- (void)applicationDidBecomeActive:(UIApplication *)application {
  // Restart any tasks that were paused (or not yet started) while the application was inactive. If
  // the application was previously in the background, optionally refresh the user interface.
}

- (void)applicationWillTerminate:(UIApplication *)application {
  // Called when the application is about to terminate. Save data if appropriate. See also
  // applicationDidEnterBackground:.
}

@end
@@ -0,0 +1,99 @@
{
  "images" : [
    {
      "idiom" : "iphone",
      "size" : "20x20",
      "scale" : "2x"
    },
    {
      "idiom" : "iphone",
      "size" : "20x20",
      "scale" : "3x"
    },
    {
      "idiom" : "iphone",
      "size" : "29x29",
      "scale" : "2x"
    },
    {
      "idiom" : "iphone",
      "size" : "29x29",
      "scale" : "3x"
    },
    {
      "idiom" : "iphone",
      "size" : "40x40",
      "scale" : "2x"
    },
    {
      "idiom" : "iphone",
      "size" : "40x40",
      "scale" : "3x"
    },
    {
      "idiom" : "iphone",
      "size" : "60x60",
      "scale" : "2x"
    },
    {
      "idiom" : "iphone",
      "size" : "60x60",
      "scale" : "3x"
    },
    {
      "idiom" : "ipad",
      "size" : "20x20",
      "scale" : "1x"
    },
    {
      "idiom" : "ipad",
      "size" : "20x20",
      "scale" : "2x"
    },
    {
      "idiom" : "ipad",
      "size" : "29x29",
      "scale" : "1x"
    },
    {
      "idiom" : "ipad",
      "size" : "29x29",
      "scale" : "2x"
    },
    {
      "idiom" : "ipad",
      "size" : "40x40",
      "scale" : "1x"
 | 
			
		||||
    },
 | 
			
		||||
    {
 | 
			
		||||
      "idiom" : "ipad",
 | 
			
		||||
      "size" : "40x40",
 | 
			
		||||
      "scale" : "2x"
 | 
			
		||||
    },
 | 
			
		||||
    {
 | 
			
		||||
      "idiom" : "ipad",
 | 
			
		||||
      "size" : "76x76",
 | 
			
		||||
      "scale" : "1x"
 | 
			
		||||
    },
 | 
			
		||||
    {
 | 
			
		||||
      "idiom" : "ipad",
 | 
			
		||||
      "size" : "76x76",
 | 
			
		||||
      "scale" : "2x"
 | 
			
		||||
    },
 | 
			
		||||
    {
 | 
			
		||||
      "idiom" : "ipad",
 | 
			
		||||
      "size" : "83.5x83.5",
 | 
			
		||||
      "scale" : "2x"
 | 
			
		||||
    },
 | 
			
		||||
    {
 | 
			
		||||
      "idiom" : "ios-marketing",
 | 
			
		||||
      "size" : "1024x1024",
 | 
			
		||||
      "scale" : "1x"
 | 
			
		||||
    }
 | 
			
		||||
  ],
 | 
			
		||||
  "info" : {
 | 
			
		||||
    "version" : 1,
 | 
			
		||||
    "author" : "xcode"
 | 
			
		||||
  }
 | 
			
		||||
}
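Each entry in the app-icon asset catalog above is a point size plus a scale factor; the PNG Xcode expects for an entry is their product (e.g. "60x60" at "3x" is a 180x180 image). A quick sketch of that arithmetic, separate from the example app:

```python
def pixel_dimensions(entry):
    """Return the (width, height) in pixels implied by one asset-catalog entry."""
    # "size" is in points, e.g. "83.5x83.5"; "scale" is e.g. "2x".
    w, h = (float(v) for v in entry["size"].split("x"))
    scale = int(entry["scale"].rstrip("x"))
    return int(w * scale), int(h * scale)

# A few entries copied from the manifest above.
entries = [
    {"idiom": "iphone", "size": "60x60", "scale": "3x"},
    {"idiom": "ipad", "size": "83.5x83.5", "scale": "2x"},
    {"idiom": "ios-marketing", "size": "1024x1024", "scale": "1x"},
]
for e in entries:
    print(e["idiom"], pixel_dimensions(e))
```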

@@ -0,0 +1,7 @@
{
  "info" : {
    "version" : 1,
    "author" : "xcode"
  }
}

mediapipe/examples/ios/handtrackinggpu/BUILD (Normal file, 95 lines)
@@ -0,0 +1,95 @@
# Copyright 2019 The MediaPipe Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

licenses(["notice"])  # Apache 2.0

MIN_IOS_VERSION = "10.0"

load(
    "@build_bazel_rules_apple//apple:ios.bzl",
    "ios_application",
)

# To use the 3D model instead of the default 2D model, add "--define 3D=true" to the
# bazel build command.
config_setting(
    name = "use_3d_model",
    define_values = {
        "3D": "true",
    },
)

genrule(
    name = "model",
    srcs = select({
        "//conditions:default": ["//mediapipe/models:hand_landmark.tflite"],
        ":use_3d_model": ["//mediapipe/models:hand_landmark_3d.tflite"],
    }),
    outs = ["hand_landmark.tflite"],
    cmd = "cp $< $@",
)

ios_application(
    name = "HandTrackingGpuApp",
    bundle_id = "com.google.mediapipe.HandTrackingGpu",
    families = [
        "iphone",
        "ipad",
    ],
    infoplists = ["Info.plist"],
    minimum_os_version = MIN_IOS_VERSION,
    provisioning_profile = "//mediapipe/examples/ios:provisioning_profile",
    deps = [
        ":HandTrackingGpuAppLibrary",
        "@ios_opencv//:OpencvFramework",
    ],
)

objc_library(
    name = "HandTrackingGpuAppLibrary",
    srcs = [
        "AppDelegate.m",
        "ViewController.mm",
        "main.m",
    ],
    hdrs = [
        "AppDelegate.h",
        "ViewController.h",
    ],
    data = [
        "Base.lproj/LaunchScreen.storyboard",
        "Base.lproj/Main.storyboard",
        ":model",
        "//mediapipe/graphs/hand_tracking:hand_tracking_mobile_gpu_binary_graph",
        "//mediapipe/models:palm_detection.tflite",
        "//mediapipe/models:palm_detection_labelmap.txt",
    ],
    sdk_frameworks = [
        "AVFoundation",
        "CoreGraphics",
        "CoreMedia",
        "UIKit",
    ],
    deps = [
        "//mediapipe/objc:mediapipe_framework_ios",
        "//mediapipe/objc:mediapipe_input_sources_ios",
        "//mediapipe/objc:mediapipe_layer_renderer",
    ] + select({
        "//mediapipe:ios_i386": [],
        "//mediapipe:ios_x86_64": [],
        "//conditions:default": [
            "//mediapipe/graphs/hand_tracking:mobile_calculators",
        ],
    }),
)
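The genrule above picks its source model with a select(): by default it copies hand_landmark.tflite, and when the build runs with "--define 3D=true" the ":use_3d_model" config_setting matches and hand_landmark_3d.tflite is copied instead. A rough sketch of that branch logic (an illustration only, not Bazel's actual resolver):

```python
def resolve_model(defines):
    """Mimic how the genrule's select() chooses its source model.

    `defines` stands in for Bazel's --define key/value flags; the label
    strings mirror the BUILD file above.
    """
    # ":use_3d_model" matches when --define 3D=true is passed.
    if defines.get("3D") == "true":
        return "//mediapipe/models:hand_landmark_3d.tflite"
    # "//conditions:default" is used otherwise.
    return "//mediapipe/models:hand_landmark.tflite"

print(resolve_model({}))              # default 2D model
print(resolve_model({"3D": "true"}))  # bazel build --define 3D=true
```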
@@ -0,0 +1,25 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<document type="com.apple.InterfaceBuilder3.CocoaTouch.Storyboard.XIB" version="3.0" toolsVersion="13122.16" targetRuntime="iOS.CocoaTouch" propertyAccessControl="none" useAutolayout="YES" launchScreen="YES" useTraitCollections="YES" useSafeAreas="YES" colorMatched="YES" initialViewController="01J-lp-oVM">
    <dependencies>
        <plugIn identifier="com.apple.InterfaceBuilder.IBCocoaTouchPlugin" version="13104.12"/>
        <capability name="Safe area layout guides" minToolsVersion="9.0"/>
        <capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
    </dependencies>
    <scenes>
        <!--View Controller-->
        <scene sceneID="EHf-IW-A2E">
            <objects>
                <viewController id="01J-lp-oVM" sceneMemberID="viewController">
                    <view key="view" contentMode="scaleToFill" id="Ze5-6b-2t3">
                        <rect key="frame" x="0.0" y="0.0" width="375" height="667"/>
                        <autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
                        <color key="backgroundColor" red="1" green="1" blue="1" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
                        <viewLayoutGuide key="safeArea" id="6Tk-OE-BBY"/>
                    </view>
                </viewController>
                <placeholder placeholderIdentifier="IBFirstResponder" id="iYj-Kq-Ea1" userLabel="First Responder" sceneMemberID="firstResponder"/>
            </objects>
            <point key="canvasLocation" x="53" y="375"/>
        </scene>
    </scenes>
</document>
@@ -0,0 +1,41 @@
<?xml version="1.0" encoding="UTF-8"?>
<document type="com.apple.InterfaceBuilder3.CocoaTouch.Storyboard.XIB" version="3.0" toolsVersion="14490.70" targetRuntime="iOS.CocoaTouch" propertyAccessControl="none" useAutolayout="YES" useTraitCollections="YES" useSafeAreas="YES" colorMatched="YES" initialViewController="BYZ-38-t0r">
    <device id="retina4_7" orientation="portrait">
        <adaptation id="fullscreen"/>
    </device>
    <dependencies>
        <plugIn identifier="com.apple.InterfaceBuilder.IBCocoaTouchPlugin" version="14490.49"/>
        <capability name="Safe area layout guides" minToolsVersion="9.0"/>
        <capability name="documents saved in the Xcode 8 format" minToolsVersion="8.0"/>
    </dependencies>
    <scenes>
        <!--View Controller-->
        <scene sceneID="tne-QT-ifu">
            <objects>
                <viewController id="BYZ-38-t0r" customClass="ViewController" sceneMemberID="viewController">
                    <view key="view" contentMode="scaleToFill" id="8bC-Xf-vdC">
                        <rect key="frame" x="0.0" y="0.0" width="375" height="667"/>
                        <autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
                        <subviews>
                            <view contentMode="scaleToFill" fixedFrame="YES" translatesAutoresizingMaskIntoConstraints="NO" id="EfB-xq-knP">
                                <rect key="frame" x="0.0" y="20" width="375" height="647"/>
                                <autoresizingMask key="autoresizingMask" widthSizable="YES" heightSizable="YES"/>
                                <color key="backgroundColor" white="0.0" alpha="1" colorSpace="custom" customColorSpace="genericGamma22GrayColorSpace"/>
                                <accessibility key="accessibilityConfiguration" label="PreviewDisplayView">
                                    <bool key="isElement" value="YES"/>
                                </accessibility>
                            </view>
                        </subviews>
                        <color key="backgroundColor" red="1" green="1" blue="1" alpha="1" colorSpace="custom" customColorSpace="sRGB"/>
                        <viewLayoutGuide key="safeArea" id="6Tk-OE-BBY"/>
                    </view>
                    <connections>
                        <outlet property="_liveView" destination="EfB-xq-knP" id="wac-VF-etz"/>
                    </connections>
                </viewController>
                <placeholder placeholderIdentifier="IBFirstResponder" id="dkx-z0-nzr" sceneMemberID="firstResponder"/>
            </objects>
            <point key="canvasLocation" x="48.799999999999997" y="20.239880059970016"/>
        </scene>
    </scenes>
</document>

mediapipe/examples/ios/handtrackinggpu/Info.plist (Normal file, 42 lines)
@@ -0,0 +1,42 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>NSCameraUsageDescription</key>
  <string>This app uses the camera to demonstrate live video processing.</string>
  <key>CFBundleDevelopmentRegion</key>
  <string>en</string>
  <key>CFBundleExecutable</key>
  <string>$(EXECUTABLE_NAME)</string>
  <key>CFBundleIdentifier</key>
  <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
  <key>CFBundleInfoDictionaryVersion</key>
  <string>6.0</string>
  <key>CFBundleName</key>
  <string>$(PRODUCT_NAME)</string>
  <key>CFBundlePackageType</key>
  <string>APPL</string>
  <key>CFBundleShortVersionString</key>
  <string>1.0</string>
  <key>CFBundleVersion</key>
  <string>1</string>
  <key>LSRequiresIPhoneOS</key>
  <true/>
  <key>UILaunchStoryboardName</key>
  <string>LaunchScreen</string>
  <key>UIMainStoryboardFile</key>
  <string>Main</string>
  <key>UIRequiredDeviceCapabilities</key>
  <array>
    <string>armv7</string>
  </array>
  <key>UISupportedInterfaceOrientations</key>
  <array>
    <string>UIInterfaceOrientationPortrait</string>
  </array>
  <key>UISupportedInterfaceOrientations~ipad</key>
  <array>
    <string>UIInterfaceOrientationPortrait</string>
  </array>
</dict>
</plist>
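The NSCameraUsageDescription key above is required for any app that opens the camera; since iOS 10, accessing the camera without it terminates the app. The plist format is also machine-readable, which a sketch with Python's stdlib plistlib can show (the embedded string is a trimmed excerpt of the file above, not the whole plist):

```python
import plistlib

# Trimmed excerpt of the Info.plist above, for illustration only.
INFO_PLIST = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>NSCameraUsageDescription</key>
  <string>This app uses the camera to demonstrate live video processing.</string>
  <key>CFBundlePackageType</key>
  <string>APPL</string>
</dict>
</plist>"""

info = plistlib.loads(INFO_PLIST)
# A camera app without this key is rejected at runtime, so check it early.
assert "NSCameraUsageDescription" in info
print(info["NSCameraUsageDescription"])
```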

mediapipe/examples/ios/handtrackinggpu/ViewController.h (Normal file, 19 lines)
@@ -0,0 +1,19 @@
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#import <UIKit/UIKit.h>

@interface ViewController : UIViewController

@end

mediapipe/examples/ios/handtrackinggpu/ViewController.mm (Normal file, 178 lines)
@@ -0,0 +1,178 @@
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#import "ViewController.h"

#import "mediapipe/objc/MPPGraph.h"
#import "mediapipe/objc/MPPCameraInputSource.h"
#import "mediapipe/objc/MPPLayerRenderer.h"

static NSString* const kGraphName = @"hand_tracking_mobile_gpu";

static const char* kInputStream = "input_video";
static const char* kOutputStream = "output_video";
static const char* kVideoQueueLabel = "com.google.mediapipe.example.videoQueue";

@interface ViewController () <MPPGraphDelegate, MPPInputSourceDelegate>

// The MediaPipe graph currently in use. Initialized in viewDidLoad, started in viewWillAppear: and
// sent video frames on _videoQueue.
@property(nonatomic) MPPGraph* mediapipeGraph;

@end

@implementation ViewController {
  /// Handles camera access via AVCaptureSession library.
  MPPCameraInputSource* _cameraSource;

  /// Inform the user when camera is unavailable.
  IBOutlet UILabel* _noCameraLabel;
  /// Display the camera preview frames.
  IBOutlet UIView* _liveView;
  /// Render frames in a layer.
  MPPLayerRenderer* _renderer;

  /// Process camera frames on this queue.
  dispatch_queue_t _videoQueue;
}

#pragma mark - Cleanup methods

- (void)dealloc {
  self.mediapipeGraph.delegate = nil;
  [self.mediapipeGraph cancel];
  // Ignore errors since we're cleaning up.
  [self.mediapipeGraph closeAllInputStreamsWithError:nil];
  [self.mediapipeGraph waitUntilDoneWithError:nil];
}

#pragma mark - MediaPipe graph methods

+ (MPPGraph*)loadGraphFromResource:(NSString*)resource {
  // Load the graph config resource.
  NSError* configLoadError = nil;
  NSBundle* bundle = [NSBundle bundleForClass:[self class]];
  if (!resource || resource.length == 0) {
    return nil;
  }
  NSURL* graphURL = [bundle URLForResource:resource withExtension:@"binarypb"];
  NSData* data = [NSData dataWithContentsOfURL:graphURL options:0 error:&configLoadError];
  if (!data) {
    NSLog(@"Failed to load MediaPipe graph config: %@", configLoadError);
    return nil;
  }

  // Parse the graph config resource into mediapipe::CalculatorGraphConfig proto object.
  mediapipe::CalculatorGraphConfig config;
  config.ParseFromArray(data.bytes, data.length);

  // Create MediaPipe graph with mediapipe::CalculatorGraphConfig proto object.
  MPPGraph* newGraph = [[MPPGraph alloc] initWithGraphConfig:config];
  [newGraph addFrameOutputStream:kOutputStream outputPacketType:MediaPipePacketPixelBuffer];
  return newGraph;
}

#pragma mark - UIViewController methods

- (void)viewDidLoad {
  [super viewDidLoad];

  _renderer = [[MPPLayerRenderer alloc] init];
  _renderer.layer.frame = _liveView.layer.bounds;
  [_liveView.layer addSublayer:_renderer.layer];
  _renderer.frameScaleMode = MediaPipeFrameScaleFillAndCrop;
  // When using the front camera, mirror the input for a more natural look.
  _renderer.mirrored = YES;

  dispatch_queue_attr_t qosAttribute = dispatch_queue_attr_make_with_qos_class(
      DISPATCH_QUEUE_SERIAL, QOS_CLASS_USER_INTERACTIVE, /*relative_priority=*/0);
  _videoQueue = dispatch_queue_create(kVideoQueueLabel, qosAttribute);

  _cameraSource = [[MPPCameraInputSource alloc] init];
  [_cameraSource setDelegate:self queue:_videoQueue];
  _cameraSource.sessionPreset = AVCaptureSessionPresetHigh;
  _cameraSource.cameraPosition = AVCaptureDevicePositionFront;
  // The frame's native format is rotated with respect to the portrait orientation.
  _cameraSource.orientation = AVCaptureVideoOrientationPortrait;

  self.mediapipeGraph = [[self class] loadGraphFromResource:kGraphName];
  self.mediapipeGraph.delegate = self;
  // Set maxFramesInFlight to a small value to avoid memory contention for real-time processing.
  self.mediapipeGraph.maxFramesInFlight = 2;
}

// In this application, there is only one ViewController which has no navigation to other view
// controllers, and there is only one View with live display showing the result of running the
// MediaPipe graph on the live video feed. If more view controllers are needed later, the graph
// setup/teardown and camera start/stop logic should be updated appropriately in response to the
// appearance/disappearance of this ViewController, as viewWillAppear: can be invoked multiple times
// depending on the application navigation flow in that case.
- (void)viewWillAppear:(BOOL)animated {
  [super viewWillAppear:animated];

  [_cameraSource requestCameraAccessWithCompletionHandler:^void(BOOL granted) {
    if (granted) {
      [self startGraphAndCamera];
      dispatch_async(dispatch_get_main_queue(), ^{
        _noCameraLabel.hidden = YES;
      });
    }
  }];
}

- (void)startGraphAndCamera {
  // Start running self.mediapipeGraph.
  NSError* error;
  if (![self.mediapipeGraph startWithError:&error]) {
    NSLog(@"Failed to start graph: %@", error);
  }

  // Start fetching frames from the camera.
  dispatch_async(_videoQueue, ^{
    [_cameraSource start];
  });
}

#pragma mark - MPPGraphDelegate methods

// Receives CVPixelBufferRef from the MediaPipe graph. Invoked on a MediaPipe worker thread.
- (void)mediapipeGraph:(MPPGraph*)graph
    didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
              fromStream:(const std::string&)streamName {
  if (streamName == kOutputStream) {
    // Display the captured image on the screen.
    CVPixelBufferRetain(pixelBuffer);
    dispatch_async(dispatch_get_main_queue(), ^{
      [_renderer renderPixelBuffer:pixelBuffer];
      CVPixelBufferRelease(pixelBuffer);
    });
  }
}

#pragma mark - MPPInputSourceDelegate methods

// Must be invoked on _videoQueue.
- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
                timestamp:(CMTime)timestamp
               fromSource:(MPPInputSource*)source {
  if (source != _cameraSource) {
    NSLog(@"Unknown source: %@", source);
    return;
  }
  [self.mediapipeGraph sendPixelBuffer:imageBuffer
                            intoStream:kInputStream
                            packetType:MediaPipePacketPixelBuffer];
}

@end
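Setting self.mediapipeGraph.maxFramesInFlight = 2 in viewDidLoad above keeps the camera from outrunning the graph: once two frames are being processed, additional frames are not queued up. A toy Python model of that backpressure counting (an illustration only; the real throttling lives inside MPPGraph):

```python
class FrameThrottler:
    """Toy model of a maxFramesInFlight-style limit on submitted frames."""

    def __init__(self, limit=2):
        self.limit = limit      # mirrors maxFramesInFlight = 2 above
        self.in_flight = 0
        self.dropped = 0

    def submit(self):
        """Try to send one camera frame into the graph; False if at the limit."""
        if self.in_flight >= self.limit:
            self.dropped += 1
            return False
        self.in_flight += 1
        return True

    def frame_done(self):
        """An output frame was delivered, freeing one slot."""
        self.in_flight -= 1

t = FrameThrottler(limit=2)
results = [t.submit(), t.submit(), t.submit()]  # third frame exceeds the limit
t.frame_done()                                  # graph finished one frame
results.append(t.submit())                      # a slot is free again
print(results, t.dropped)
```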

mediapipe/examples/ios/handtrackinggpu/main.m (Normal file, 22 lines)
@@ -0,0 +1,22 @@
// Copyright 2019 The MediaPipe Authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//      http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#import <UIKit/UIKit.h>
#import "AppDelegate.h"

int main(int argc, char * argv[]) {
  @autoreleasepool {
    return UIApplicationMain(argc, argv, nil, NSStringFromClass([AppDelegate class]));
  }
}

mediapipe/models/hand_landmark.tflite (Normal file, BIN)
mediapipe/models/hand_landmark_3d.tflite (Normal file, BIN)
mediapipe/models/palm_detection.tflite (Normal file, BIN)
mediapipe/models/palm_detection_labelmap.txt (Normal file, 1 line)
@@ -0,0 +1 @@
Palm