diff --git a/README.md b/README.md
index c36179178..261213099 100644
--- a/README.md
+++ b/README.md
@@ -20,7 +20,7 @@ Check out the [Examples page](https://mediapipe.readthedocs.io/en/latest/example
A web-based visualizer is hosted on [viz.mediapipe.dev](https://viz.mediapipe.dev/). Please also see instructions [here](mediapipe/docs/visualizer.md).
## Community forum
-* [discuss](https://groups.google.com/forum/#!forum/mediapipe) - General community discussion around MediaPipe
+* [Discuss](https://groups.google.com/forum/#!forum/mediapipe) - General community discussion around MediaPipe
## Publications
* [MediaPipe: A Framework for Building Perception Pipelines](https://arxiv.org/abs/1906.08172)
@@ -29,7 +29,7 @@ A web-based visualizer is hosted on [viz.mediapipe.dev](https://viz.mediapipe.de
[Open sourced at CVPR 2019](https://sites.google.com/corp/view/perception-cv4arvr/mediapipe) on June 17~20 in Long Beach, CA
## Alpha Disclaimer
-MediaPipe is currently in alpha for v0.5. We are still making breaking API changes and expect to get to stable API by v1.0.
+MediaPipe is currently in alpha for v0.6. We are still making breaking API changes and expect to reach a stable API by v1.0.
## Contributing
We welcome contributions. Please follow these [guidelines](./CONTRIBUTING.md).
diff --git a/_config.yml b/_config.yml
deleted file mode 100644
index 2f7efbeab..000000000
--- a/_config.yml
+++ /dev/null
@@ -1 +0,0 @@
-theme: jekyll-theme-minimal
\ No newline at end of file
diff --git a/mediapipe/docs/examples.md b/mediapipe/docs/examples.md
index 56b0ed51e..b31a995c3 100644
--- a/mediapipe/docs/examples.md
+++ b/mediapipe/docs/examples.md
@@ -22,7 +22,7 @@ Android example users go through in detail. It teaches the following:
### Hello World! on iOS
[Hello World! on iOS](./hello_world_ios.md) is the iOS version of Sobel edge
-detection example
+detection example.
### Object Detection with GPU
@@ -44,8 +44,9 @@ graphs can be easily adapted to run on CPU v.s. GPU.
[Face Detection with GPU](./face_detection_mobile_gpu.md) illustrates how to use
MediaPipe with a TFLite model for face detection in a GPU-accelerated pipeline.
The selfie face detection TFLite model is based on
-["BlazeFace: Sub-millisecond Neural Face Detection on Mobile GPUs"](https://sites.google.com/view/perception-cv4arvr/blazeface).
-[Model card](https://sites.google.com/corp/view/perception-cv4arvr/blazeface#h.p_21ojPZDx3cqq).
+["BlazeFace: Sub-millisecond Neural Face Detection on Mobile GPUs"](https://sites.google.com/view/perception-cv4arvr/blazeface),
+and model details are described in the
+[model card](https://sites.google.com/corp/view/perception-cv4arvr/blazeface#h.p_21ojPZDx3cqq).
* [Android](./face_detection_mobile_gpu.md#android)
* [iOS](./face_detection_mobile_gpu.md#ios)
@@ -71,8 +72,9 @@ MediaPipe with a TFLite model for hand tracking in a GPU-accelerated pipeline.
[Hair Segmentation on GPU](./hair_segmentation_mobile_gpu.md) illustrates how to
use MediaPipe with a TFLite model for hair segmentation in a GPU-accelerated
pipeline. The selfie hair segmentation TFLite model is based on
-["Real-time Hair segmentation and recoloring on Mobile GPUs"](https://sites.google.com/view/perception-cv4arvr/hair-segmentation).
-[Model card](https://sites.google.com/corp/view/perception-cv4arvr/hair-segmentation#h.p_NimuO7PgHxlY).
+["Real-time Hair segmentation and recoloring on Mobile GPUs"](https://sites.google.com/view/perception-cv4arvr/hair-segmentation),
+and model details are described in the
+[model card](https://sites.google.com/corp/view/perception-cv4arvr/hair-segmentation#h.p_NimuO7PgHxlY).
* [Android](./hair_segmentation_mobile_gpu.md#android)
diff --git a/mediapipe/docs/face_detection_mobile_gpu.md b/mediapipe/docs/face_detection_mobile_gpu.md
index 4bf7d6f0f..265797cf4 100644
--- a/mediapipe/docs/face_detection_mobile_gpu.md
+++ b/mediapipe/docs/face_detection_mobile_gpu.md
@@ -4,22 +4,22 @@ This doc focuses on the
[example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/face_detection/face_detection_mobile_gpu.pbtxt)
that performs face detection with TensorFlow Lite on GPU.
-![face_detection_android_gpu_gif](images/mobile/face_detection_android_gpu.gif){width="300"}
+![face_detection_android_gpu_gif](images/mobile/face_detection_android_gpu.gif)
## Android
Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
general instructions to develop an Android application that uses MediaPipe.
-The graph is used in the
-[Face Detection GPU](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/facedetectiongpu)
-example app. To build the app, run:
+The graph below is used in the
+[Face Detection GPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/facedetectiongpu).
+To build the app, run:
```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/facedetectiongpu
```
-To further install the app on android device, run:
+To further install the app on an Android device, run:
```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/facedetectiongpu/facedetectiongpu.apk
@@ -28,13 +28,13 @@ adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/a
## iOS
Please see [Hello World! in MediaPipe on iOS](hello_world_ios.md) for general
-instructions to develop an iOS application that uses MediaPipe. The graph below
-is used in the
-[Face Detection GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/facedetectiongpu).
+instructions to develop an iOS application that uses MediaPipe.
-To build the iOS app, please see the general
+The graph below is used in the
+[Face Detection GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/facedetectiongpu).
+To build the app, please see the general
[MediaPipe iOS app building and setup instructions](./mediapipe_ios_setup.md).
-Specifically, run:
+Specific to this example, run:
```bash
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/facedetectiongpu:FaceDetectionGpuApp
@@ -42,11 +42,13 @@ bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/facedetectiongpu:Fa
## Graph
-![face_detection_mobile_gpu_graph](images/mobile/face_detection_mobile_gpu.png){width="400"}
+![face_detection_mobile_gpu_graph](images/mobile/face_detection_mobile_gpu.png)
To visualize the graph as shown above, copy the text specification of the graph
below and paste it into [MediaPipe Visualizer](https://viz.mediapipe.dev/).
+[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/face_detection/face_detection_mobile_gpu.pbtxt)
+
```bash
# MediaPipe graph that performs face detection with TensorFlow Lite on GPU.
# Used in the example in
diff --git a/mediapipe/docs/hair_segmentation_mobile_gpu.md b/mediapipe/docs/hair_segmentation_mobile_gpu.md
index 08a370b40..cdb8c3876 100644
--- a/mediapipe/docs/hair_segmentation_mobile_gpu.md
+++ b/mediapipe/docs/hair_segmentation_mobile_gpu.md
@@ -1,25 +1,25 @@
# Hair Segmentation (GPU)
This doc focuses on the
-[below example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hair_segmentation/hair_segmentation_android_gpu.pbtxt)
+[example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hair_segmentation/hair_segmentation_mobile_gpu.pbtxt)
that performs hair segmentation with TensorFlow Lite on GPU.
-![hair_segmentation_android_gpu_gif](images/mobile/hair_segmentation_android_gpu.gif){width="300"}
+![hair_segmentation_android_gpu_gif](images/mobile/hair_segmentation_android_gpu.gif)
## Android
Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
general instructions to develop an Android application that uses MediaPipe.
-The graph is used in the
-[Hair Segmentation GPU](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/hairsegmentationgpu)
-example app. To build the app, run:
+The graph below is used in the
+[Hair Segmentation GPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/hairsegmentationgpu).
+To build the app, run:
```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/hairsegmentationgpu
```
-To further install the app on android device, run:
+To further install the app on an Android device, run:
```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/hairsegmentationgpu/hairsegmentationgpu.apk
@@ -27,11 +27,13 @@ adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/a
## Graph
-![hair_segmentation_mobile_gpu_graph](images/mobile/hair_segmentation_mobile_gpu.png){width="600"}
+![hair_segmentation_mobile_gpu_graph](images/mobile/hair_segmentation_mobile_gpu.png)
To visualize the graph as shown above, copy the text specification of the graph
below and paste it into [MediaPipe Visualizer](https://viz.mediapipe.dev/).
+[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hair_segmentation/hair_segmentation_mobile_gpu.pbtxt)
+
```bash
# MediaPipe graph that performs hair segmentation with TensorFlow Lite on GPU.
# Used in the example in
diff --git a/mediapipe/docs/hand_detection_mobile_gpu.md b/mediapipe/docs/hand_detection_mobile_gpu.md
index 2dcd4df70..aee637570 100644
--- a/mediapipe/docs/hand_detection_mobile_gpu.md
+++ b/mediapipe/docs/hand_detection_mobile_gpu.md
@@ -2,30 +2,36 @@
This doc focuses on the
[example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_gpu.pbtxt)
-that performs hand detection with TensorFlow Lite on GPU. This hand detection
-example is related to
-[hand tracking GPU example](./hand_tracking_mobile_gpu.md). Here is the
-[model card](https://mediapipe.page.link/handmc) for hand detection.
+that performs hand detection with TensorFlow Lite on GPU. It is related to the
+[hand tracking example](./hand_tracking_mobile_gpu.md).
-For overall context on hand detection and hand tracking, please read
-[this Google AI blog post](https://mediapipe.page.link/handgoogleaiblog).
+For overall context on hand detection and hand tracking, please read this
+[Google AI Blog post](https://mediapipe.page.link/handgoogleaiblog).
-![hand_detection_android_gpu_gif](images/mobile/hand_detection_android_gpu.gif){width="300"}
+![hand_detection_android_gpu_gif](images/mobile/hand_detection_android_gpu.gif)
+
+In the visualization above, green boxes represent the results of palm detection,
+and the red box represents the extended hand rectangle designed to cover the
+entire hand. The palm detection ML model (see also
+[model card](https://mediapipe.page.link/handmc)) supports detection of multiple
+palms, and this example selects only the one with the highest detection
+confidence score to generate the hand rectangle, to be further utilized in the
+[hand tracking example](./hand_tracking_mobile_gpu.md).
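+
+As an illustrative sketch only (not code from this repository), selecting the
+best palm amounts to a simple arg-max over detection scores. The
+`PalmDetection` struct below is a hypothetical stand-in for the detection
+output of the graph:
+
+```
+#include <vector>
+
+// Hypothetical stand-in for a single palm detection produced by the graph.
+struct PalmDetection {
+  float score;                      // Detection confidence.
+  float xmin, ymin, width, height;  // Relative bounding box.
+};
+
+// Returns the detection with the highest confidence, or nullptr if none.
+const PalmDetection* SelectBestPalm(const std::vector<PalmDetection>& palms) {
+  const PalmDetection* best = nullptr;
+  for (const PalmDetection& palm : palms) {
+    if (best == nullptr || palm.score > best->score) best = &palm;
+  }
+  return best;
+}
+```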
## Android
Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
general instructions to develop an Android application that uses MediaPipe.
-The graph is used in the
-[Hand Detection GPU](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu)
-example app. To build the app, run:
+The graph below is used in the
+[Hand Detection GPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu).
+To build the app, run:
```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu
```
-To further install the app on android device, run:
+To further install the app on an Android device, run:
```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/handdetectiongpu.apk
@@ -34,13 +40,13 @@ adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/a
## iOS
Please see [Hello World! in MediaPipe on iOS](hello_world_ios.md) for general
-instructions to develop an iOS application that uses MediaPipe. The graph below
-is used in the
-[Hand Detection GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/handdetectiongpu)
+instructions to develop an iOS application that uses MediaPipe.
-To build the iOS app, please see the general
+The graph below is used in the
+[Hand Detection GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/handdetectiongpu).
+To build the app, please see the general
[MediaPipe iOS app building and setup instructions](./mediapipe_ios_setup.md).
-Specifically, run:
+Specific to this example, run:
```bash
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/handdetectiongpu:HandDetectionGpuApp
@@ -48,17 +54,18 @@ bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/handdetectiongpu:Ha
## Graph
-The hand detection graph is
-[hand_detection_mobile.pbtxt](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_mobile.pbtxt)
-and it includes a [HandDetectionSubgraph](./framework_concepts.md#subgraph) with
-filename
-[hand_detection_gpu.pbtxt](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_gpu.pbtxt)
-shown as a box called `HandDetection` in purple
+The hand detection [main graph](#main-graph) internally utilizes a
+[hand detection subgraph](#hand-detection-subgraph). The subgraph shows up in
+the main graph visualization as the `HandDetection` node colored in purple, and
+the subgraph itself can also be visualized just like a regular graph. For more
+information on how to visualize a graph that includes subgraphs, see
+[visualizing subgraphs](./visualizer.md#visualizing-subgraphs).
-For more information on how to visualize a graph that includes subgraphs, see
-[subgraph documentation](./visualizer.md#visualizing-subgraphs) for Visualizer.
+### Main Graph
-![hand_detection_mobile_graph](images/mobile/hand_detection_mobile.png){width="500"}
+![hand_detection_mobile_graph](images/mobile/hand_detection_mobile.png)
+
+[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_mobile.pbtxt)
```bash
# MediaPipe graph that performs hand detection with TensorFlow Lite on GPU.
@@ -125,9 +132,15 @@ node {
}
```
-![hand_detection_gpu_subgraph](images/mobile/hand_detection_gpu_subgraph.png){width="500"}
+### Hand Detection Subgraph
+
+![hand_detection_gpu_subgraph](images/mobile/hand_detection_gpu_subgraph.png)
+
+[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_gpu.pbtxt)
```bash
+# MediaPipe hand detection subgraph.
+
type: "HandDetectionSubgraph"
input_stream: "input_video"
diff --git a/mediapipe/docs/hand_tracking_mobile_gpu.md b/mediapipe/docs/hand_tracking_mobile_gpu.md
index ec6d833e4..4d8ff4e0d 100644
--- a/mediapipe/docs/hand_tracking_mobile_gpu.md
+++ b/mediapipe/docs/hand_tracking_mobile_gpu.md
@@ -1,32 +1,41 @@
# Hand Tracking (GPU)
This doc focuses on the
-[example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_android_gpu.pbtxt)
-that performs hand tracking with TensorFlow Lite on GPU. This hand tracking
-example is related to
-[hand detection GPU example](./hand_detection_mobile_gpu.md). We recommend users
-to review the hand detection GPU example first. Here is the
-[model card](https://mediapipe.page.link/handmc) for hand tracking.
+[example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_tracking_mobile.pbtxt)
+that performs hand tracking with TensorFlow Lite on GPU. It is related to the
+[hand detection example](./hand_detection_mobile_gpu.md), and we recommend
+reviewing the hand detection example first.
-For overall context on hand detection and hand tracking, please read
-[this Google AI blog post](https://mediapipe.page.link/handgoogleaiblog).
+For overall context on hand detection and hand tracking, please read this
+[Google AI Blog post](https://mediapipe.page.link/handgoogleaiblog).
-![hand_tracking_android_gpu.gif](images/mobile/hand_tracking_android_gpu.gif){width="300"}
+![hand_tracking_android_gpu.gif](images/mobile/hand_tracking_android_gpu.gif)
+
+In the visualization above, the red dots represent the localized hand landmarks,
+and the green lines are simply connections between selected landmark pairs for
+visualization of the hand skeleton. The red box represents a hand rectangle that
+covers the entire hand, derived either from hand detection (see
+[hand detection example](./hand_detection_mobile_gpu.md)) or from the previous
+round of hand landmark localization using an ML model (see also
+[model card](https://mediapipe.page.link/handmc)). Hand landmark localization is
+performed only within the hand rectangle for computational efficiency and
+accuracy, and hand detection is only invoked when landmark localization could
+not identify hand presence in the previous iteration.
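+
+As a minimal sketch (illustrative only, with hypothetical types and helper
+functions standing in for the streams and subgraphs of the actual graph), the
+per-frame interplay between detection and tracking looks roughly like this:
+
+```
+#include <optional>
+#include <vector>
+
+struct Frame {};  // Stand-in for an input video frame (a GpuBuffer in the graph).
+struct Rect { float x_center, y_center, width, height, rotation; };
+struct Landmarks { std::vector<float> points; bool hand_present; };
+
+// Hypothetical helpers standing in for the detection and landmark subgraphs.
+std::optional<Rect> DetectPalmAndExpandToHandRect(const Frame& frame);
+Landmarks LocalizeHandLandmarks(const Frame& frame, const Rect& hand_rect);
+Rect RectFromLandmarks(const Landmarks& landmarks);
+
+void ProcessFrame(const Frame& frame, std::optional<Rect>& hand_rect) {
+  // Palm detection runs only when no hand rectangle was carried over from the
+  // previous frame, i.e., landmark localization lost the hand.
+  if (!hand_rect.has_value()) {
+    hand_rect = DetectPalmAndExpandToHandRect(frame);
+    if (!hand_rect.has_value()) return;  // No hand in this frame.
+  }
+  // Landmark localization runs only inside the hand rectangle.
+  Landmarks landmarks = LocalizeHandLandmarks(frame, *hand_rect);
+  if (landmarks.hand_present) {
+    hand_rect = RectFromLandmarks(landmarks);  // Reuse on the next frame.
+  } else {
+    hand_rect.reset();  // Fall back to palm detection on the next frame.
+  }
+}
+```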
## Android
Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
general instructions to develop an Android application that uses MediaPipe.
-The graph is used in the
-[Hand Tracking GPU](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu)
-example app. To build the app, run:
+The graph below is used in the
+[Hand Tracking GPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu).
+To build the app, run:
```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu
```
-To further install the app on android device, run:
+To further install the app on an Android device, run:
```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/handtrackinggpu.apk
@@ -35,13 +44,13 @@ adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/a
## iOS
Please see [Hello World! in MediaPipe on iOS](hello_world_ios.md) for general
-instructions to develop an iOS application that uses MediaPipe. The graph below
-is used in the
-[Hand Tracking GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/handtrackinggpu)
+instructions to develop an iOS application that uses MediaPipe.
-To build the iOS app, please see the general
+The graph below is used in the
+[Hand Tracking GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/handtrackinggpu).
+To build the app, please see the general
[MediaPipe iOS app building and setup instructions](./mediapipe_ios_setup.md).
-Specifically, run:
+Specific to this example, run:
```bash
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/handtrackinggpu:HandTrackingGpuApp
@@ -49,20 +58,21 @@ bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/handtrackinggpu:Han
## Graph
-For more information on how to visualize a graph that includes subgraphs, see
-[subgraph documentation](./visualizer.md#visualizing-subgraphs) for Visualizer.
+The hand tracking [main graph](#main-graph) internally utilizes a
+[hand detection subgraph](#hand-detection-subgraph), a
+[hand landmark subgraph](#hand-landmark-subgraph) and a
+[renderer subgraph](#renderer-subgraph).
-The hand tracking graph is
-[hand_tracking_mobile.pbtxt](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_tracking_mobile.pbtxt)
-and it includes 3 [subgraphs](./framework_concepts.md#subgraph):
+The subgraphs show up in the main graph visualization as nodes colored in
+purple, and each subgraph can also be visualized just like a regular
+graph. For more information on how to visualize a graph that includes subgraphs,
+see [visualizing subgraphs](./visualizer.md#visualizing-subgraphs).
-* [HandDetectionSubgraph - hand_detection_gpu.pbtxt](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_gpu.pbtxt)
+### Main Graph
-* [HandLandmarkSubgraph - hand_landmark_gpu.pbtxt](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_landmark_gpu.pbtxt)
+![hand_tracking_mobile_graph](images/mobile/hand_tracking_mobile.png)
-* [RendererSubgraph - renderer_gpu.pbtxt](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/renderer_gpu.pbtxt)
-
-![hand_tracking_mobile_graph](images/mobile/hand_tracking_mobile.png){width="400"}
+[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_tracking_mobile.pbtxt)
```bash
# MediaPipe graph that performs hand tracking with TensorFlow Lite on GPU.
@@ -152,9 +162,15 @@ node {
}
```
-![hand_detection_gpu_subgraph](images/mobile/hand_detection_gpu_subgraph.png){width="500"}
+### Hand Detection Subgraph
+
+![hand_detection_gpu_subgraph](images/mobile/hand_detection_gpu_subgraph.png)
+
+[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_detection_gpu.pbtxt)
```bash
+# MediaPipe hand detection subgraph.
+
type: "HandDetectionSubgraph"
input_stream: "input_video"
@@ -352,7 +368,11 @@ node {
}
```
-![hand_landmark_gpu_subgraph.pbtxt](images/mobile/hand_landmark_gpu_subgraph.png){width="400"}
+### Hand Landmark Subgraph
+
+![hand_landmark_gpu_subgraph.pbtxt](images/mobile/hand_landmark_gpu_subgraph.png)
+
+[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/hand_landmark_gpu.pbtxt)
```bash
# MediaPipe hand landmark localization subgraph.
@@ -532,7 +552,11 @@ node {
}
```
-![hand_renderer_gpu_subgraph.pbtxt](images/mobile/hand_renderer_gpu_subgraph.png){width="500"}
+### Renderer Subgraph
+
+![hand_renderer_gpu_subgraph.pbtxt](images/mobile/hand_renderer_gpu_subgraph.png)
+
+[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/hand_tracking/renderer_gpu.pbtxt)
```bash
# MediaPipe hand tracking rendering subgraph.
diff --git a/mediapipe/docs/hello_world_android.md b/mediapipe/docs/hello_world_android.md
index ef78f92ed..e44ae92d2 100644
--- a/mediapipe/docs/hello_world_android.md
+++ b/mediapipe/docs/hello_world_android.md
@@ -14,7 +14,7 @@ graph on Android.
A simple camera app for real-time Sobel edge detection applied to a live video
stream on an Android device.
-![edge_detection_android_gpu_gif](images/mobile/edge_detection_android_gpu.gif){width="300"}
+![edge_detection_android_gpu_gif](images/mobile/edge_detection_android_gpu.gif)
## Setup
@@ -56,7 +56,7 @@ node: {
A visualization of the graph is shown below:
-![edge_detection_mobile_gpu_graph](images/mobile/edge_detection_mobile_graph_gpu.png){width="200"}
+![edge_detection_mobile_gpu](images/mobile/edge_detection_mobile_gpu.png)
This graph has a single input stream named `input_video` for all incoming frames
that will be provided by your device's camera.
@@ -252,7 +252,7 @@ adb install bazel-bin/$APPLICATION_PATH/edgedetectiongpu.apk
Open the application on your device. It should display a screen with the text
`Hello World!`.
-![bazel_hello_world_android](images/mobile/bazel_hello_world_android.png){width="300"}
+![bazel_hello_world_android](images/mobile/bazel_hello_world_android.png)
## Using the camera via `CameraX`
@@ -369,7 +369,7 @@ Add the following line in the `$APPLICATION_PATH/res/values/strings.xml` file:
When the user doesn't grant camera permission, the screen will now look like
this:
-![missing_camera_permission_android](images/mobile/missing_camera_permission_android.png){width="300"}
+![missing_camera_permission_android](images/mobile/missing_camera_permission_android.png)
Now, we will add the [`SurfaceTexture`] and [`SurfaceView`] objects to
`MainActivity`:
@@ -709,7 +709,7 @@ And that's it! You should now be able to successfully build and run the
application on the device and see Sobel edge detection running on a live camera
feed! Congrats!
-![edge_detection_android_gpu_gif](images/mobile/edge_detection_android_gpu.gif){width="300"}
+![edge_detection_android_gpu_gif](images/mobile/edge_detection_android_gpu.gif)
If you ran into any issues, please see the full code of the tutorial
[here](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/edgedetectiongpu).
diff --git a/mediapipe/docs/hello_world_desktop.md b/mediapipe/docs/hello_world_desktop.md
index 9f3617dfe..236b8675e 100644
--- a/mediapipe/docs/hello_world_desktop.md
+++ b/mediapipe/docs/hello_world_desktop.md
@@ -72,7 +72,7 @@
This graph consists of 1 graph input stream (`in`) and 1 graph output stream
(`out`), and 2 [`PassThroughCalculator`]s connected serially.
- ![hello_world.cc graph](./images/hello_world_graph.png){width="200"}
+ ![hello_world graph](./images/hello_world.png)
4. Before running the graph, an `OutputStreamPoller` object is connected to the
output stream in order to later retrieve the graph output, and a graph run
diff --git a/mediapipe/docs/hello_world_ios.md b/mediapipe/docs/hello_world_ios.md
index a8a8791e7..c79dac57f 100644
--- a/mediapipe/docs/hello_world_ios.md
+++ b/mediapipe/docs/hello_world_ios.md
@@ -14,7 +14,7 @@ graph on iOS.
A simple camera app for real-time Sobel edge detection applied to a live video
stream on an iOS device.
-![edge_detection_ios_gpu_gif](images/mobile/edge_detection_ios_gpu.gif){width="300"}
+![edge_detection_ios_gpu_gif](images/mobile/edge_detection_ios_gpu.gif)
## Setup
@@ -54,7 +54,7 @@ node: {
A visualization of the graph is shown below:
-![edge_detection_mobile_gpu_graph](images/mobile/edge_detection_mobile_graph_gpu.png){width="200"}
+![edge_detection_mobile_gpu](images/mobile/edge_detection_mobile_gpu.png)
This graph has a single input stream named `input_video` for all incoming frames
that will be provided by your device's camera.
@@ -174,10 +174,11 @@ bazel build -c opt --config=ios_arm64 <$APPLICATION_PATH>:EdgeDetectionGpuApp'
```
For example, to build the `EdgeDetectionGpuApp` application in
-`mediapipe/examples/ios/edgedetection`, use the following command:
+`mediapipe/examples/ios/edgedetectiongpu`, use the following
+command:
```
-bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/edgedetection:EdgeDetectionGpuApp
+bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/edgedetectiongpu:EdgeDetectionGpuApp
```
Then, go back to XCode, open Window > Devices and Simulators, select your
@@ -188,9 +189,9 @@ blank white screen.
## Use the camera for the live view feed
-In this tutorial, we will use the `MediaPipeCameraInputSource` class to access
-and grab frames from the camera. This class uses the `AVCaptureSession` API to
-get the frames from the camera.
+In this tutorial, we will use the `MPPCameraInputSource` class to access and
+grab frames from the camera. This class uses the `AVCaptureSession` API to get
+the frames from the camera.
But before using this class, change the `Info.plist` file to support camera
usage in the app.
@@ -198,7 +199,7 @@ usage in the app.
In `ViewController.m`, add the following import line:
```
-#import "mediapipe/objc/MediaPipeCameraInputSource.h"
+#import "mediapipe/objc/MPPCameraInputSource.h"
```
Add the following to its implementation block to create an object
@@ -207,7 +208,7 @@ Add the following to its implementation block to create an object
```
@implementation ViewController {
// Handles camera access via AVCaptureSession library.
- MediaPipeCameraInputSource* _cameraSource;
+ MPPCameraInputSource* _cameraSource;
}
```
@@ -217,7 +218,7 @@ Add the following code to `viewDidLoad()`:
-(void)viewDidLoad {
[super viewDidLoad];
- _cameraSource = [[MediaPipeCameraInputSource alloc] init];
+ _cameraSource = [[MPPCameraInputSource alloc] init];
_cameraSource.sessionPreset = AVCaptureSessionPresetHigh;
_cameraSource.cameraPosition = AVCaptureDevicePositionBack;
// The frame's native format is rotated with respect to the portrait orientation.
@@ -229,10 +230,10 @@ The code initializes `_cameraSource`, sets the capture session preset, and which
camera to use.
We need to get frames from the `_cameraSource` into our application
-`ViewController` to display them. `MediaPipeCameraInputSource` is a subclass of
-`MediaPipeInputSource`, which provides a protocol for its delegates, namely the
-`MediaPipeInputSourceDelegate`. So our application `ViewController` can be a
-delegate of `_cameraSource`.
+`ViewController` to display them. `MPPCameraInputSource` is a subclass of
+`MPPInputSource`, which provides a protocol for its delegates, namely the
+`MPPInputSourceDelegate`. So our application `ViewController` can be a delegate
+of `_cameraSource`.
To handle camera setup and process incoming frames, we should use a queue
different from the main queue. Add the following to the implementation block of
@@ -269,11 +270,11 @@ the interface/implementation of the `ViewController`:
static const char* kVideoQueueLabel = "com.google.mediapipe.example.videoQueue";
```
-Before implementing any method from `MediaPipeInputSourceDelegate` protocol, we
-must first set up a way to display the camera frames. MediaPipe provides another
-utility called `MediaPipeLayerRenderer` to display images on the screen. This
-utility can be used to display `CVPixelBufferRef` objects, which is the type of
-the images provided by `MediaPipeCameraInputSource` to its delegates.
+Before implementing any method from `MPPInputSourceDelegate` protocol, we must
+first set up a way to display the camera frames. MediaPipe provides another
+utility called `MPPLayerRenderer` to display images on the screen. This utility
+can be used to display `CVPixelBufferRef` objects, which is the type of the
+images provided by `MPPCameraInputSource` to its delegates.
To display images of the screen, we need to add a new `UIView` object called
`_liveView` to the `ViewController`.
@@ -284,7 +285,7 @@ Add the following lines to the implementation block of the `ViewController`:
// Display the camera preview frames.
IBOutlet UIView* _liveView;
// Render frames in a layer.
-MediaPipeLayerRenderer* _renderer;
+MPPLayerRenderer* _renderer;
```
Go to `Main.storyboard`, add a `UIView` object from the object library to the
@@ -296,7 +297,7 @@ Go back to `ViewController.m` and add the following code to `viewDidLoad()` to
initialize the `_renderer` object:
```
-_renderer = [[MediaPipeLayerRenderer alloc] init];
+_renderer = [[MPPLayerRenderer alloc] init];
_renderer.layer.frame = _liveView.layer.bounds;
[_liveView.layer addSublayer:_renderer.layer];
_renderer.frameScaleMode = MediaPipeFrameScaleFillAndCrop;
@@ -308,7 +309,7 @@ To get frames from the camera, we will implement the following method:
// Must be invoked on _videoQueue.
- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
timestamp:(CMTime)timestamp
- fromSource:(MediaPipeInputSource*)source {
+ fromSource:(MPPInputSource*)source {
if (source != _cameraSource) {
NSLog(@"Unknown source: %@", source);
return;
@@ -322,7 +323,7 @@ To get frames from the camera, we will implement the following method:
}
```
-This is a delegate method of `MediaPipeInputSource`. We first check that we are
+This is a delegate method of `MPPInputSource`. We first check that we are
getting frames from the right source, i.e. the `_cameraSource`. Then we display
the frame received from the camera via `_renderer` on the main queue.
@@ -337,7 +338,7 @@ about to appear. To do this, we will implement the
```
Before we start running the camera, we need the user's permission to access it.
-`MediaPipeCameraInputSource` provides a function
+`MPPCameraInputSource` provides a function
`requestCameraAccessWithCompletionHandler:(void (^_Nullable)(BOOL
granted))handler` to request camera access and do some work when the user has
responded. Add the following code to `viewWillAppear:animated`:
@@ -413,7 +414,7 @@ Add the following property to the interface of the `ViewController`:
```
// The MediaPipe graph currently in use. Initialized in viewDidLoad, started in viewWillAppear: and
// sent video frames on _videoQueue.
-@property(nonatomic) MediaPipeGraph* mediapipeGraph;
+@property(nonatomic) MPPGraph* mediapipeGraph;
```
As explained in the comment above, we will initialize this graph in
@@ -421,7 +422,7 @@ As explained in the comment above, we will initialize this graph in
using the following function:
```
-+ (MediaPipeGraph*)loadGraphFromResource:(NSString*)resource {
++ (MPPGraph*)loadGraphFromResource:(NSString*)resource {
// Load the graph config resource.
NSError* configLoadError = nil;
NSBundle* bundle = [NSBundle bundleForClass:[self class]];
@@ -440,7 +441,7 @@ using the following function:
config.ParseFromArray(data.bytes, data.length);
// Create MediaPipe graph with mediapipe::CalculatorGraphConfig proto object.
- MediaPipeGraph* newGraph = [[MediaPipeGraph alloc] initWithGraphConfig:config];
+ MPPGraph* newGraph = [[MPPGraph alloc] initWithGraphConfig:config];
[newGraph addFrameOutputStream:kOutputStream outputPacketType:MediaPipePacketPixelBuffer];
return newGraph;
}
@@ -498,7 +499,7 @@ this function's implementation to do the following:
```
- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
timestamp:(CMTime)timestamp
- fromSource:(MediaPipeInputSource*)source {
+ fromSource:(MPPInputSource*)source {
if (source != _cameraSource) {
NSLog(@"Unknown source: %@", source);
return;
@@ -518,9 +519,9 @@ The graph will run with this input packet and output a result in
method to receive packets on this output stream and display them on the screen:
```
-- (void)mediapipeGraph:(MediaPipeGraph*)graph
- didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
- fromStream:(const std::string&)streamName {
+- (void)mediapipeGraph:(MPPGraph*)graph
+ didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
+ fromStream:(const std::string&)streamName {
if (streamName == kOutputStream) {
// Display the captured image on the screen.
CVPixelBufferRetain(pixelBuffer);
@@ -535,7 +536,7 @@ method to receive packets on this output stream and display them on the screen:
And that is all! Build and run the app on your iOS device. You should see the
results of running the edge detection graph on a live video feed. Congrats!
-![edge_detection_ios_gpu_gif](images/mobile/edge_detection_ios_gpu.gif){width="300"}
+![edge_detection_ios_gpu_gif](images/mobile/edge_detection_ios_gpu.gif)
If you ran into any issues, please see the full code of the tutorial
[here](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/edgedetectiongpu).
diff --git a/mediapipe/docs/images/hello_world.png b/mediapipe/docs/images/hello_world.png
new file mode 100644
index 000000000..1005d7ffc
Binary files /dev/null and b/mediapipe/docs/images/hello_world.png differ
diff --git a/mediapipe/docs/images/hello_world_graph.png b/mediapipe/docs/images/hello_world_graph.png
deleted file mode 100644
index c36aafc08..000000000
Binary files a/mediapipe/docs/images/hello_world_graph.png and /dev/null differ
diff --git a/mediapipe/docs/images/mobile/bazel_hello_world_android.png b/mediapipe/docs/images/mobile/bazel_hello_world_android.png
index dd50be3e7..758e68cb8 100644
Binary files a/mediapipe/docs/images/mobile/bazel_hello_world_android.png and b/mediapipe/docs/images/mobile/bazel_hello_world_android.png differ
diff --git a/mediapipe/docs/images/mobile/edge_detection_android_gpu.gif b/mediapipe/docs/images/mobile/edge_detection_android_gpu.gif
index 4192eb224..a78a39876 100644
Binary files a/mediapipe/docs/images/mobile/edge_detection_android_gpu.gif and b/mediapipe/docs/images/mobile/edge_detection_android_gpu.gif differ
diff --git a/mediapipe/docs/images/mobile/edge_detection_ios_gpu.gif b/mediapipe/docs/images/mobile/edge_detection_ios_gpu.gif
index 7417fcbc3..6d1a73060 100644
Binary files a/mediapipe/docs/images/mobile/edge_detection_ios_gpu.gif and b/mediapipe/docs/images/mobile/edge_detection_ios_gpu.gif differ
diff --git a/mediapipe/docs/images/mobile/edge_detection_mobile_gpu.png b/mediapipe/docs/images/mobile/edge_detection_mobile_gpu.png
new file mode 100644
index 000000000..a082ec1d0
Binary files /dev/null and b/mediapipe/docs/images/mobile/edge_detection_mobile_gpu.png differ
diff --git a/mediapipe/docs/images/mobile/edge_detection_mobile_graph_gpu.png b/mediapipe/docs/images/mobile/edge_detection_mobile_graph_gpu.png
deleted file mode 100644
index 0555c2d13..000000000
Binary files a/mediapipe/docs/images/mobile/edge_detection_mobile_graph_gpu.png and /dev/null differ
diff --git a/mediapipe/docs/images/mobile/face_detection_android_gpu.gif b/mediapipe/docs/images/mobile/face_detection_android_gpu.gif
index 28ae7d51c..983595e68 100644
Binary files a/mediapipe/docs/images/mobile/face_detection_android_gpu.gif and b/mediapipe/docs/images/mobile/face_detection_android_gpu.gif differ
diff --git a/mediapipe/docs/images/mobile/hair_segmentation_mobile_gpu.png b/mediapipe/docs/images/mobile/hair_segmentation_mobile_gpu.png
index 465046816..2a87ee834 100644
Binary files a/mediapipe/docs/images/mobile/hair_segmentation_mobile_gpu.png and b/mediapipe/docs/images/mobile/hair_segmentation_mobile_gpu.png differ
diff --git a/mediapipe/docs/images/mobile/hand_detection_gpu_subgraph.png b/mediapipe/docs/images/mobile/hand_detection_gpu_subgraph.png
index c3fbc2ee0..ba1fe9786 100644
Binary files a/mediapipe/docs/images/mobile/hand_detection_gpu_subgraph.png and b/mediapipe/docs/images/mobile/hand_detection_gpu_subgraph.png differ
diff --git a/mediapipe/docs/images/mobile/hand_tracking_mobile.png b/mediapipe/docs/images/mobile/hand_tracking_mobile.png
index 83850f507..66b9a7a9e 100644
Binary files a/mediapipe/docs/images/mobile/hand_tracking_mobile.png and b/mediapipe/docs/images/mobile/hand_tracking_mobile.png differ
diff --git a/mediapipe/docs/images/mobile/missing_camera_permission_android.png b/mediapipe/docs/images/mobile/missing_camera_permission_android.png
index 9e35aebaa..d492d56b8 100644
Binary files a/mediapipe/docs/images/mobile/missing_camera_permission_android.png and b/mediapipe/docs/images/mobile/missing_camera_permission_android.png differ
diff --git a/mediapipe/docs/images/mobile/object_detection_desktop_tensorflow.png b/mediapipe/docs/images/mobile/object_detection_desktop_tensorflow.png
deleted file mode 100644
index 50d7597f1..000000000
Binary files a/mediapipe/docs/images/mobile/object_detection_desktop_tensorflow.png and /dev/null differ
diff --git a/mediapipe/docs/images/mobile/object_detection_desktop_tflite.png b/mediapipe/docs/images/mobile/object_detection_desktop_tflite.png
deleted file mode 100644
index b66ff2c09..000000000
Binary files a/mediapipe/docs/images/mobile/object_detection_desktop_tflite.png and /dev/null differ
diff --git a/mediapipe/docs/images/object_detection_desktop_tensorflow.png b/mediapipe/docs/images/object_detection_desktop_tensorflow.png
index e1a363f16..50d7597f1 100644
Binary files a/mediapipe/docs/images/object_detection_desktop_tensorflow.png and b/mediapipe/docs/images/object_detection_desktop_tensorflow.png differ
diff --git a/mediapipe/docs/images/object_detection_desktop_tflite.png b/mediapipe/docs/images/object_detection_desktop_tflite.png
index f987f1db3..27963d13d 100644
Binary files a/mediapipe/docs/images/object_detection_desktop_tflite.png and b/mediapipe/docs/images/object_detection_desktop_tflite.png differ
diff --git a/mediapipe/docs/index.rst b/mediapipe/docs/index.rst
index 6b3556ff2..870d02b2b 100644
--- a/mediapipe/docs/index.rst
+++ b/mediapipe/docs/index.rst
@@ -8,9 +8,9 @@ machine learning pipeline can be built as a graph of modular components,
including, for instance, inference models and media processing functions. Sensory
data such as audio and video streams enter the graph, and perceived descriptions
such as object-localization and face-landmark streams exit the graph. An example
-graph that performs real-time hair segmentation on mobile GPU is shown below.
+graph that performs real-time hand tracking on mobile GPU is shown below.
-.. image:: images/mobile/hair_segmentation_android_gpu.png
+.. image:: images/mobile/hand_tracking_mobile.png
:width: 400
:alt: Example MediaPipe graph
@@ -29,11 +29,11 @@ APIs for MediaPipe
* (Coming Soon) Graph Construction API in C++
* Graph Execution API in C++
* Graph Execution API in Java (Android)
- * (Coming Soon) Graph Execution API in Objective-C (iOS)
+ * Graph Execution API in Objective-C (iOS)
Alpha Disclaimer
==================
-MediaPipe is currently in alpha for v0.5. We are still making breaking API changes and expect to get to stable API by v1.0. We recommend that you target a specific version of MediaPipe, and periodically bump to the latest release. That way you have control over when a breaking change affects you.
+MediaPipe is currently in alpha for v0.6. We are still making breaking API changes and expect to reach a stable API by v1.0. We recommend that you target a specific version of MediaPipe, and periodically bump to the latest release. That way you have control over when a breaking change affects you.
User Documentation
==================
diff --git a/mediapipe/docs/object_detection_desktop.md b/mediapipe/docs/object_detection_desktop.md
index f69eab16e..88334993e 100644
--- a/mediapipe/docs/object_detection_desktop.md
+++ b/mediapipe/docs/object_detection_desktop.md
@@ -44,7 +44,7 @@ $ bazel-bin/mediapipe/examples/desktop/object_detection/object_detection_tensorf
#### Graph
-![graph visualization](images/object_detection_desktop_tensorflow.png){width="800"}
+![graph visualization](images/object_detection_desktop_tensorflow.png)
To visualize the graph as shown above, copy the text specification of the graph
below and paste it into
@@ -209,7 +209,7 @@ $ bazel-bin/mediapipe/examples/desktop/object_detection/object_detection_tflite
#### Graph
-![graph visualization](images/object_detection_desktop_tflite.png){width="400"}
+![graph visualization](images/object_detection_desktop_tflite.png)
To visualize the graph as shown above, copy the text specification of the graph
below and paste it into
diff --git a/mediapipe/docs/object_detection_mobile_cpu.md b/mediapipe/docs/object_detection_mobile_cpu.md
index 4ee0459c7..7f8d8ef23 100644
--- a/mediapipe/docs/object_detection_mobile_cpu.md
+++ b/mediapipe/docs/object_detection_mobile_cpu.md
@@ -1,8 +1,6 @@
# Object Detection (CPU)
-Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
-general instructions to develop an Android application that uses MediaPipe. This
-doc focuses on the
+This doc focuses on the
[example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/object_detection/object_detection_mobile_cpu.pbtxt)
that performs object detection with TensorFlow Lite on CPU.
@@ -11,22 +9,25 @@ This is very similar to the
except that at the beginning and the end of the graph it performs GPU-to-CPU and
CPU-to-GPU image transfer respectively. As a result, the rest of graph, which
shares the same configuration as the
-[GPU graph](images/mobile/object_detection_android_gpu.png), runs entirely on
+[GPU graph](images/mobile/object_detection_mobile_gpu.png), runs entirely on
CPU.
-![object_detection_android_cpu_gif](images/mobile/object_detection_android_cpu.gif){width="300"}
+![object_detection_android_cpu_gif](images/mobile/object_detection_android_cpu.gif)
## Android
-The graph is used in the
-[Object Detection CPU](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectioncpu)
-example app. To build the app, run:
+Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
+general instructions to develop an Android application that uses MediaPipe.
+
+The graph below is used in the
+[Object Detection CPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectioncpu).
+To build the app, run:
```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectioncpu
```
-To further install the app on android device, run:
+To further install the app on an Android device, run:
```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectioncpu/objectdetectioncpu.apk
@@ -35,13 +36,13 @@ adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/a
## iOS
Please see [Hello World! in MediaPipe on iOS](hello_world_ios.md) for general
-instructions to develop an iOS application that uses MediaPipe. The graph below
-is used in the
-[Object Detection GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/objectdetectioncpu).
+instructions to develop an iOS application that uses MediaPipe.
-To build the iOS app, please see the general
+The graph below is used in the
+[Object Detection CPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/objectdetectioncpu).
+To build the app, please see the general
[MediaPipe iOS app building and setup instructions](./mediapipe_ios_setup.md).
-Specifically, run:
+Specific to this example, run:
```bash
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/objectdetectioncpu:ObjectDetectionCpuApp
@@ -49,11 +50,13 @@ bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/objectdetectioncpu:
## Graph
-![object_detection_mobile_cpu_graph](images/mobile/object_detection_mobile_cpu.png){width="400"}
+![object_detection_mobile_cpu_graph](images/mobile/object_detection_mobile_cpu.png)
To visualize the graph as shown above, copy the text specification of the graph
below and paste it into [MediaPipe Visualizer](https://viz.mediapipe.dev/).
+[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/object_detection/object_detection_mobile_cpu.pbtxt)
+
```bash
# MediaPipe graph that performs object detection with TensorFlow Lite on CPU.
# Used in the example in
diff --git a/mediapipe/docs/object_detection_mobile_gpu.md b/mediapipe/docs/object_detection_mobile_gpu.md
index 917f7aa3d..7423e2f1d 100644
--- a/mediapipe/docs/object_detection_mobile_gpu.md
+++ b/mediapipe/docs/object_detection_mobile_gpu.md
@@ -4,7 +4,7 @@ This doc focuses on the
[below example graph](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/object_detection/object_detection_mobile_gpu.pbtxt)
that performs object detection with TensorFlow Lite on GPU.
-![object_detection_android_gpu_gif](images/mobile/object_detection_android_gpu.gif){width="300"}
+![object_detection_android_gpu_gif](images/mobile/object_detection_android_gpu.gif)
## Android
@@ -12,14 +12,14 @@ Please see [Hello World! in MediaPipe on Android](hello_world_android.md) for
general instructions to develop an Android application that uses MediaPipe.
The graph below is used in the
-[Object Detection GPU](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectiongpu)
-example app. To build the app, run:
+[Object Detection GPU Android example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectiongpu).
+To build the app, run:
```bash
bazel build -c opt --config=android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectiongpu
```
-To further install the app on android device, run:
+To further install the app on an Android device, run:
```bash
adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetectiongpu/objectdetectiongpu.apk
@@ -28,13 +28,13 @@ adb install bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/a
## iOS
Please see [Hello World! in MediaPipe on iOS](hello_world_ios.md) for general
-instructions to develop an iOS application that uses MediaPipe. The graph below
-is used in the
-[Object Detection GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/objectdetectiongpu)
+instructions to develop an iOS application that uses MediaPipe.
-To build the iOS app, please see the general
+The graph below is used in the
+[Object Detection GPU iOS example app](https://github.com/google/mediapipe/tree/master/mediapipe/examples/ios/objectdetectiongpu).
+To build the app, please see the general
[MediaPipe iOS app building and setup instructions](./mediapipe_ios_setup.md).
-Specifically, run:
+Specific to this example, run:
```bash
bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/objectdetectiongpu:ObjectDetectionGpuApp
@@ -42,11 +42,13 @@ bazel build -c opt --config=ios_arm64 mediapipe/examples/ios/objectdetectiongpu:
## Graph
-![object_detection_mobile_gpu_graph](images/mobile/object_detection_mobile_gpu.png){width="400"}
+![object_detection_mobile_gpu_graph](images/mobile/object_detection_mobile_gpu.png)
To visualize the graph as shown above, copy the text specification of the graph
below and paste it into [MediaPipe Visualizer](https://viz.mediapipe.dev/).
+[Source pbtxt file](https://github.com/google/mediapipe/tree/master/mediapipe/graphs/object_detection/object_detection_mobile_gpu.pbtxt)
+
```bash
# MediaPipe graph that performs object detection with TensorFlow Lite on GPU.
# Used in the example in
diff --git a/mediapipe/docs/visualizer.md b/mediapipe/docs/visualizer.md
index f4216b65f..275feb10f 100644
--- a/mediapipe/docs/visualizer.md
+++ b/mediapipe/docs/visualizer.md
@@ -79,4 +79,4 @@ and its associated [subgraph](./framework_concepts.md#subgraph) called
* Click on the subgraph block in purple `Hand Detection` and the
`hand_detection_gpu.pbtxt` tab will open
- ![Hand detection subgraph](./images/clicksubgraph_handdetection.png){width="1500"}
+  ![Hand detection subgraph](./images/click_subgraph_handdetection.png)
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/AndroidManifest.xml b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/AndroidManifest.xml
new file mode 100644
index 000000000..89d4480d6
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/AndroidManifest.xml
@@ -0,0 +1,33 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/BUILD b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/BUILD
new file mode 100644
index 000000000..439aed99c
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/BUILD
@@ -0,0 +1,83 @@
+# Copyright 2019 The MediaPipe Authors.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+licenses(["notice"]) # Apache 2.0
+
+package(default_visibility = ["//visibility:private"])
+
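+# Native JNI library bundling the MediaPipe framework bindings and the
+# calculators needed by the hand detection graph; MainActivity loads it via
+# System.loadLibrary("mediapipe_jni").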
+cc_binary(
+ name = "libmediapipe_jni.so",
+ linkshared = 1,
+ linkstatic = 1,
+ deps = [
+ "//mediapipe/graphs/hand_tracking:detection_mobile_calculators",
+ "//mediapipe/java/com/google/mediapipe/framework/jni:mediapipe_framework_jni",
+ ],
+)
+
+cc_library(
+ name = "mediapipe_jni_lib",
+ srcs = [":libmediapipe_jni.so"],
+ alwayslink = 1,
+)
+
+# Maps the binary graph to an alias (e.g., the app name) for convenience so that the alias can be
+# easily incorporated into the app via, for example,
+# MainActivity.BINARY_GRAPH_NAME = "appname.binarypb".
+genrule(
+ name = "binary_graph",
+ srcs = ["//mediapipe/graphs/hand_tracking:hand_detection_mobile_gpu_binary_graph"],
+ outs = ["handdetectiongpu.binarypb"],
+ cmd = "cp $< $@",
+)
+
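+# Android library with the example's Java sources plus the binary graph and
+# TFLite model files packaged as app assets.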
+android_library(
+ name = "mediapipe_lib",
+ srcs = glob(["*.java"]),
+ assets = [
+ ":binary_graph",
+ "//mediapipe/models:palm_detection.tflite",
+ "//mediapipe/models:palm_detection_labelmap.txt",
+ ],
+ assets_dir = "",
+ manifest = "AndroidManifest.xml",
+ resource_files = glob(["res/**"]),
+ deps = [
+ ":mediapipe_jni_lib",
+ "//mediapipe/java/com/google/mediapipe/components:android_camerax_helper",
+ "//mediapipe/java/com/google/mediapipe/components:android_components",
+ "//mediapipe/java/com/google/mediapipe/framework:android_framework",
+ "//mediapipe/java/com/google/mediapipe/glutil",
+ "//third_party:androidx_appcompat",
+ "//third_party:androidx_constraint_layout",
+ "//third_party:androidx_legacy_support_v4",
+ "//third_party:androidx_material",
+ "//third_party:androidx_recyclerview",
+ "//third_party:opencv",
+ "@androidx_concurrent_futures//jar",
+ "@androidx_lifecycle//jar",
+ "@com_google_code_findbugs//jar",
+ "@com_google_guava_android//jar",
+ ],
+)
+
+android_binary(
+ name = "handdetectiongpu",
+ manifest = "AndroidManifest.xml",
+ manifest_values = {"applicationId": "com.google.mediapipe.apps.handdetectiongpu"},
+ multidex = "native",
+ deps = [
+ ":mediapipe_lib",
+ ],
+)
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/MainActivity.java b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/MainActivity.java
new file mode 100644
index 000000000..9fe2b097e
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/MainActivity.java
@@ -0,0 +1,167 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package com.google.mediapipe.apps.handdetectiongpu;
+
+import android.graphics.SurfaceTexture;
+import android.os.Bundle;
+import androidx.appcompat.app.AppCompatActivity;
+import android.util.Size;
+import android.view.SurfaceHolder;
+import android.view.SurfaceView;
+import android.view.View;
+import android.view.ViewGroup;
+import com.google.mediapipe.components.CameraHelper;
+import com.google.mediapipe.components.CameraXPreviewHelper;
+import com.google.mediapipe.components.ExternalTextureConverter;
+import com.google.mediapipe.components.FrameProcessor;
+import com.google.mediapipe.components.PermissionHelper;
+import com.google.mediapipe.framework.AndroidAssetUtil;
+import com.google.mediapipe.glutil.EglManager;
+
+/** Main activity of MediaPipe example apps. */
+public class MainActivity extends AppCompatActivity {
+ private static final String TAG = "MainActivity";
+
+ private static final String BINARY_GRAPH_NAME = "handdetectiongpu.binarypb";
+ private static final String INPUT_VIDEO_STREAM_NAME = "input_video";
+ private static final String OUTPUT_VIDEO_STREAM_NAME = "output_video";
+ private static final CameraHelper.CameraFacing CAMERA_FACING = CameraHelper.CameraFacing.FRONT;
+
+ // Flips the camera-preview frames vertically before sending them into FrameProcessor to be
+ // processed in a MediaPipe graph, and flips the processed frames back when they are displayed.
+ // This is needed because OpenGL represents images assuming the image origin is at the bottom-left
+ // corner, whereas MediaPipe in general assumes the image origin is at top-left.
+ private static final boolean FLIP_FRAMES_VERTICALLY = true;
+
+ static {
+ // Load all native libraries needed by the app.
+ System.loadLibrary("mediapipe_jni");
+ System.loadLibrary("opencv_java4");
+ }
+
+ // {@link SurfaceTexture} where the camera-preview frames can be accessed.
+ private SurfaceTexture previewFrameTexture;
+ // {@link SurfaceView} that displays the camera-preview frames processed by a MediaPipe graph.
+ private SurfaceView previewDisplayView;
+
+ // Creates and manages an {@link EGLContext}.
+ private EglManager eglManager;
+ // Sends camera-preview frames into a MediaPipe graph for processing, and displays the processed
+ // frames onto a {@link Surface}.
+ private FrameProcessor processor;
+ // Converts the GL_TEXTURE_EXTERNAL_OES texture from Android camera into a regular texture to be
+ // consumed by {@link FrameProcessor} and the underlying MediaPipe graph.
+ private ExternalTextureConverter converter;
+
+ // Handles camera access via the {@link CameraX} Jetpack support library.
+ private CameraXPreviewHelper cameraHelper;
+
+ @Override
+ protected void onCreate(Bundle savedInstanceState) {
+ super.onCreate(savedInstanceState);
+ setContentView(R.layout.activity_main);
+
+ previewDisplayView = new SurfaceView(this);
+ setupPreviewDisplayView();
+
+ // Initialize the asset manager so that MediaPipe native libraries can access the app assets, e.g.,
+ // binary graphs.
+ AndroidAssetUtil.initializeNativeAssetManager(this);
+
+ eglManager = new EglManager(null);
+ processor =
+ new FrameProcessor(
+ this,
+ eglManager.getNativeContext(),
+ BINARY_GRAPH_NAME,
+ INPUT_VIDEO_STREAM_NAME,
+ OUTPUT_VIDEO_STREAM_NAME);
+ processor.getVideoSurfaceOutput().setFlipY(FLIP_FRAMES_VERTICALLY);
+
+ PermissionHelper.checkAndRequestCameraPermissions(this);
+ }
+
+ @Override
+ protected void onResume() {
+ super.onResume();
+ converter = new ExternalTextureConverter(eglManager.getContext());
+ converter.setFlipY(FLIP_FRAMES_VERTICALLY);
+ converter.setConsumer(processor);
+ if (PermissionHelper.cameraPermissionsGranted(this)) {
+ startCamera();
+ }
+ }
+
+ @Override
+ protected void onPause() {
+ super.onPause();
+ converter.close();
+ }
+
+ @Override
+ public void onRequestPermissionsResult(
+ int requestCode, String[] permissions, int[] grantResults) {
+ super.onRequestPermissionsResult(requestCode, permissions, grantResults);
+ PermissionHelper.onRequestPermissionsResult(requestCode, permissions, grantResults);
+ }
+
+ private void setupPreviewDisplayView() {
+ previewDisplayView.setVisibility(View.GONE);
+ ViewGroup viewGroup = findViewById(R.id.preview_display_layout);
+ viewGroup.addView(previewDisplayView);
+
+ previewDisplayView
+ .getHolder()
+ .addCallback(
+ new SurfaceHolder.Callback() {
+ @Override
+ public void surfaceCreated(SurfaceHolder holder) {
+ processor.getVideoSurfaceOutput().setSurface(holder.getSurface());
+ }
+
+ @Override
+ public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
+ // (Re-)Compute the ideal size of the camera-preview display (the area that the
+ // camera-preview frames get rendered onto, potentially with scaling and rotation)
+ // based on the size of the SurfaceView that contains the display.
+ Size viewSize = new Size(width, height);
+ Size displaySize = cameraHelper.computeDisplaySizeFromViewSize(viewSize);
+
+ // Connect the converter to the camera-preview frames as its input (via
+ // previewFrameTexture), and configure the output width and height as the computed
+ // display size.
+ converter.setSurfaceTextureAndAttachToGLContext(
+ previewFrameTexture, displaySize.getWidth(), displaySize.getHeight());
+ }
+
+ @Override
+ public void surfaceDestroyed(SurfaceHolder holder) {
+ processor.getVideoSurfaceOutput().setSurface(null);
+ }
+ });
+ }
+
+ private void startCamera() {
+ cameraHelper = new CameraXPreviewHelper();
+ cameraHelper.setOnCameraStartedListener(
+ surfaceTexture -> {
+ previewFrameTexture = surfaceTexture;
+ // Make the display view visible to start showing the preview. This triggers the
+ // SurfaceHolder.Callback added to (the holder of) previewDisplayView.
+ previewDisplayView.setVisibility(View.VISIBLE);
+ });
+ cameraHelper.startCamera(this, CAMERA_FACING, /*surfaceTexture=*/ null);
+ }
+}
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/layout/activity_main.xml b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/layout/activity_main.xml
new file mode 100644
index 000000000..c19d7e628
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/layout/activity_main.xml
@@ -0,0 +1,20 @@
+
+
+
+
+
+
+
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/values/colors.xml b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/values/colors.xml
new file mode 100644
index 000000000..69b22338c
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/values/colors.xml
@@ -0,0 +1,6 @@
+<?xml version="1.0" encoding="utf-8"?>
+<resources>
+    <color name="colorPrimary">#008577</color>
+    <color name="colorPrimaryDark">#00574B</color>
+    <color name="colorAccent">#D81B60</color>
+</resources>
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/values/strings.xml b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/values/strings.xml
new file mode 100644
index 000000000..35c39cfb0
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/values/strings.xml
@@ -0,0 +1,4 @@
+<resources>
+    <string name="app_name">Hand Detection GPU</string>
+    <string name="no_camera_access">Please grant camera permissions.</string>
+</resources>
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/values/styles.xml b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/values/styles.xml
new file mode 100644
index 000000000..5885930df
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handdetectiongpu/res/values/styles.xml
@@ -0,0 +1,11 @@
+
+
+
+
+
+
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/AndroidManifest.xml b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/AndroidManifest.xml
new file mode 100644
index 000000000..b51aab7f7
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/AndroidManifest.xml
@@ -0,0 +1,33 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/BUILD b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/BUILD
new file mode 100644
index 000000000..9dd6b475d
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/BUILD
@@ -0,0 +1,103 @@
+# Copyright 2019 The MediaPipe Authors.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+licenses(["notice"]) # Apache 2.0
+
+package(default_visibility = ["//visibility:private"])
+
+cc_binary(
+ name = "libmediapipe_jni.so",
+ linkshared = 1,
+ linkstatic = 1,
+ deps = [
+ "//mediapipe/graphs/hand_tracking:mobile_calculators",
+ "//mediapipe/java/com/google/mediapipe/framework/jni:mediapipe_framework_jni",
+ ],
+)
+
+cc_library(
+ name = "mediapipe_jni_lib",
+ srcs = [":libmediapipe_jni.so"],
+ alwayslink = 1,
+)
+
+# Maps the binary graph to an alias (e.g., the app name) for convenience so that the alias can be
+# easily incorporated into the app via, for example,
+# MainActivity.BINARY_GRAPH_NAME = "appname.binarypb".
+genrule(
+ name = "binary_graph",
+ srcs = ["//mediapipe/graphs/hand_tracking:hand_tracking_mobile_gpu_binary_graph"],
+ outs = ["handtrackinggpu.binarypb"],
+ cmd = "cp $< $@",
+)
+
+# To use the 3D model instead of the default 2D model, add "--define 3D=true" to the
+# bazel build command.
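+# For example, a hypothetical invocation (exact flags depend on your setup and .bazelrc):
+#   bazel build -c opt --config=android_arm64 --define 3D=true \
+#     //mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu:handtrackinggpu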
+config_setting(
+ name = "use_3d_model",
+ define_values = {
+ "3D": "true",
+ },
+)
+
+genrule(
+ name = "model",
+ srcs = select({
+ "//conditions:default": ["//mediapipe/models:hand_landmark.tflite"],
+ ":use_3d_model": ["//mediapipe/models:hand_landmark_3d.tflite"],
+ }),
+ outs = ["hand_landmark.tflite"],
+ cmd = "cp $< $@",
+)
+
+android_library(
+ name = "mediapipe_lib",
+ srcs = glob(["*.java"]),
+ assets = [
+ ":binary_graph",
+ ":model",
+ "//mediapipe/models:palm_detection.tflite",
+ "//mediapipe/models:palm_detection_labelmap.txt",
+ ],
+ assets_dir = "",
+ manifest = "AndroidManifest.xml",
+ resource_files = glob(["res/**"]),
+ deps = [
+ ":mediapipe_jni_lib",
+ "//mediapipe/java/com/google/mediapipe/components:android_camerax_helper",
+ "//mediapipe/java/com/google/mediapipe/components:android_components",
+ "//mediapipe/java/com/google/mediapipe/framework:android_framework",
+ "//mediapipe/java/com/google/mediapipe/glutil",
+ "//third_party:androidx_appcompat",
+ "//third_party:androidx_constraint_layout",
+ "//third_party:androidx_legacy_support_v4",
+ "//third_party:androidx_material",
+ "//third_party:androidx_recyclerview",
+ "//third_party:opencv",
+ "@androidx_concurrent_futures//jar",
+ "@androidx_lifecycle//jar",
+ "@com_google_code_findbugs//jar",
+ "@com_google_guava_android//jar",
+ ],
+)
+
+android_binary(
+ name = "handtrackinggpu",
+ manifest = "AndroidManifest.xml",
+ manifest_values = {"applicationId": "com.google.mediapipe.apps.handtrackinggpu"},
+ multidex = "native",
+ deps = [
+ ":mediapipe_lib",
+ ],
+)
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/MainActivity.java b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/MainActivity.java
new file mode 100644
index 000000000..ea694b302
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/MainActivity.java
@@ -0,0 +1,167 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+package com.google.mediapipe.apps.handtrackinggpu;
+
+import android.graphics.SurfaceTexture;
+import android.os.Bundle;
+import androidx.appcompat.app.AppCompatActivity;
+import android.util.Size;
+import android.view.SurfaceHolder;
+import android.view.SurfaceView;
+import android.view.View;
+import android.view.ViewGroup;
+import com.google.mediapipe.components.CameraHelper;
+import com.google.mediapipe.components.CameraXPreviewHelper;
+import com.google.mediapipe.components.ExternalTextureConverter;
+import com.google.mediapipe.components.FrameProcessor;
+import com.google.mediapipe.components.PermissionHelper;
+import com.google.mediapipe.framework.AndroidAssetUtil;
+import com.google.mediapipe.glutil.EglManager;
+
+/** Main activity of MediaPipe example apps. */
+public class MainActivity extends AppCompatActivity {
+ private static final String TAG = "MainActivity";
+
+ private static final String BINARY_GRAPH_NAME = "handtrackinggpu.binarypb";
+ private static final String INPUT_VIDEO_STREAM_NAME = "input_video";
+ private static final String OUTPUT_VIDEO_STREAM_NAME = "output_video";
+ private static final CameraHelper.CameraFacing CAMERA_FACING = CameraHelper.CameraFacing.FRONT;
+
+ // Flips the camera-preview frames vertically before sending them into FrameProcessor to be
+ // processed in a MediaPipe graph, and flips the processed frames back when they are displayed.
+ // This is needed because OpenGL represents images assuming the image origin is at the bottom-left
+ // corner, whereas MediaPipe in general assumes the image origin is at top-left.
+ private static final boolean FLIP_FRAMES_VERTICALLY = true;
+
+ static {
+ // Load all native libraries needed by the app.
+ System.loadLibrary("mediapipe_jni");
+ System.loadLibrary("opencv_java4");
+ }
+
+ // {@link SurfaceTexture} where the camera-preview frames can be accessed.
+ private SurfaceTexture previewFrameTexture;
+ // {@link SurfaceView} that displays the camera-preview frames processed by a MediaPipe graph.
+ private SurfaceView previewDisplayView;
+
+ // Creates and manages an {@link EGLContext}.
+ private EglManager eglManager;
+ // Sends camera-preview frames into a MediaPipe graph for processing, and displays the processed
+ // frames onto a {@link Surface}.
+ private FrameProcessor processor;
+ // Converts the GL_TEXTURE_EXTERNAL_OES texture from Android camera into a regular texture to be
+ // consumed by {@link FrameProcessor} and the underlying MediaPipe graph.
+ private ExternalTextureConverter converter;
+
+ // Handles camera access via the {@link CameraX} Jetpack support library.
+ private CameraXPreviewHelper cameraHelper;
+
+ @Override
+ protected void onCreate(Bundle savedInstanceState) {
+ super.onCreate(savedInstanceState);
+ setContentView(R.layout.activity_main);
+
+ previewDisplayView = new SurfaceView(this);
+ setupPreviewDisplayView();
+
+ // Initialize the asset manager so that MediaPipe native libraries can access the app assets, e.g.,
+ // binary graphs.
+ AndroidAssetUtil.initializeNativeAssetManager(this);
+
+ eglManager = new EglManager(null);
+ processor =
+ new FrameProcessor(
+ this,
+ eglManager.getNativeContext(),
+ BINARY_GRAPH_NAME,
+ INPUT_VIDEO_STREAM_NAME,
+ OUTPUT_VIDEO_STREAM_NAME);
+ processor.getVideoSurfaceOutput().setFlipY(FLIP_FRAMES_VERTICALLY);
+
+ PermissionHelper.checkAndRequestCameraPermissions(this);
+ }
+
+ @Override
+ protected void onResume() {
+ super.onResume();
+ converter = new ExternalTextureConverter(eglManager.getContext());
+ converter.setFlipY(FLIP_FRAMES_VERTICALLY);
+ converter.setConsumer(processor);
+ if (PermissionHelper.cameraPermissionsGranted(this)) {
+ startCamera();
+ }
+ }
+
+ @Override
+ protected void onPause() {
+ super.onPause();
+ converter.close();
+ }
+
+ @Override
+ public void onRequestPermissionsResult(
+ int requestCode, String[] permissions, int[] grantResults) {
+ super.onRequestPermissionsResult(requestCode, permissions, grantResults);
+ PermissionHelper.onRequestPermissionsResult(requestCode, permissions, grantResults);
+ }
+
+ private void setupPreviewDisplayView() {
+ previewDisplayView.setVisibility(View.GONE);
+ ViewGroup viewGroup = findViewById(R.id.preview_display_layout);
+ viewGroup.addView(previewDisplayView);
+
+ previewDisplayView
+ .getHolder()
+ .addCallback(
+ new SurfaceHolder.Callback() {
+ @Override
+ public void surfaceCreated(SurfaceHolder holder) {
+ processor.getVideoSurfaceOutput().setSurface(holder.getSurface());
+ }
+
+ @Override
+ public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
+ // (Re-)Compute the ideal size of the camera-preview display (the area that the
+ // camera-preview frames get rendered onto, potentially with scaling and rotation)
+ // based on the size of the SurfaceView that contains the display.
+ Size viewSize = new Size(width, height);
+ Size displaySize = cameraHelper.computeDisplaySizeFromViewSize(viewSize);
+
+ // Connect the converter to the camera-preview frames as its input (via
+ // previewFrameTexture), and configure the output width and height as the computed
+ // display size.
+ converter.setSurfaceTextureAndAttachToGLContext(
+ previewFrameTexture, displaySize.getWidth(), displaySize.getHeight());
+ }
+
+ @Override
+ public void surfaceDestroyed(SurfaceHolder holder) {
+ processor.getVideoSurfaceOutput().setSurface(null);
+ }
+ });
+ }
+
+ private void startCamera() {
+ cameraHelper = new CameraXPreviewHelper();
+ cameraHelper.setOnCameraStartedListener(
+ surfaceTexture -> {
+ previewFrameTexture = surfaceTexture;
+ // Make the display view visible to start showing the preview. This triggers the
+ // SurfaceHolder.Callback added to (the holder of) previewDisplayView.
+ previewDisplayView.setVisibility(View.VISIBLE);
+ });
+ cameraHelper.startCamera(this, CAMERA_FACING, /*surfaceTexture=*/ null);
+ }
+}
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/layout/activity_main.xml b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/layout/activity_main.xml
new file mode 100644
index 000000000..c19d7e628
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/layout/activity_main.xml
@@ -0,0 +1,20 @@
+
+
+
+
+
+
+
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/values/colors.xml b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/values/colors.xml
new file mode 100644
index 000000000..69b22338c
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/values/colors.xml
@@ -0,0 +1,6 @@
+<?xml version="1.0" encoding="utf-8"?>
+<resources>
+    <color name="colorPrimary">#008577</color>
+    <color name="colorPrimaryDark">#00574B</color>
+    <color name="colorAccent">#D81B60</color>
+</resources>
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/values/strings.xml b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/values/strings.xml
new file mode 100644
index 000000000..f30d0965d
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/values/strings.xml
@@ -0,0 +1,4 @@
+<resources>
+    <string name="app_name">Hand Tracking GPU</string>
+    <string name="no_camera_access">Please grant camera permissions.</string>
+</resources>
diff --git a/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/values/styles.xml b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/values/styles.xml
new file mode 100644
index 000000000..5885930df
--- /dev/null
+++ b/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu/res/values/styles.xml
@@ -0,0 +1,11 @@
+
+
+
+
+
+
diff --git a/mediapipe/examples/ios/handdetectiongpu/AppDelegate.h b/mediapipe/examples/ios/handdetectiongpu/AppDelegate.h
new file mode 100644
index 000000000..6b0377ef2
--- /dev/null
+++ b/mediapipe/examples/ios/handdetectiongpu/AppDelegate.h
@@ -0,0 +1,21 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+#import <UIKit/UIKit.h>
+
+@interface AppDelegate : UIResponder <UIApplicationDelegate>
+
+@property(strong, nonatomic) UIWindow *window;
+
+@end
diff --git a/mediapipe/examples/ios/handdetectiongpu/AppDelegate.m b/mediapipe/examples/ios/handdetectiongpu/AppDelegate.m
new file mode 100644
index 000000000..9e1b7ff0e
--- /dev/null
+++ b/mediapipe/examples/ios/handdetectiongpu/AppDelegate.m
@@ -0,0 +1,59 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+#import "AppDelegate.h"
+
+@interface AppDelegate ()
+
+@end
+
+@implementation AppDelegate
+
+- (BOOL)application:(UIApplication *)application
+ didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
+ // Override point for customization after application launch.
+ return YES;
+}
+
+- (void)applicationWillResignActive:(UIApplication *)application {
+ // Sent when the application is about to move from active to inactive state. This can occur for
+ // certain types of temporary interruptions (such as an incoming phone call or SMS message) or
+ // when the user quits the application and it begins the transition to the background state. Use
+ // this method to pause ongoing tasks, disable timers, and invalidate graphics rendering
+ // callbacks. Games should use this method to pause the game.
+}
+
+- (void)applicationDidEnterBackground:(UIApplication *)application {
+ // Use this method to release shared resources, save user data, invalidate timers, and store
+ // enough application state information to restore your application to its current state in case
+ // it is terminated later. If your application supports background execution, this method is
+ // called instead of applicationWillTerminate: when the user quits.
+}
+
+- (void)applicationWillEnterForeground:(UIApplication *)application {
+ // Called as part of the transition from the background to the active state; here you can undo
+ // many of the changes made on entering the background.
+}
+
+- (void)applicationDidBecomeActive:(UIApplication *)application {
+ // Restart any tasks that were paused (or not yet started) while the application was inactive. If
+ // the application was previously in the background, optionally refresh the user interface.
+}
+
+- (void)applicationWillTerminate:(UIApplication *)application {
+ // Called when the application is about to terminate. Save data if appropriate. See also
+ // applicationDidEnterBackground:.
+}
+
+@end
diff --git a/mediapipe/examples/ios/handdetectiongpu/Assets.xcassets/AppIcon.appiconset/Contents.json b/mediapipe/examples/ios/handdetectiongpu/Assets.xcassets/AppIcon.appiconset/Contents.json
new file mode 100644
index 000000000..a1895a242
--- /dev/null
+++ b/mediapipe/examples/ios/handdetectiongpu/Assets.xcassets/AppIcon.appiconset/Contents.json
@@ -0,0 +1,99 @@
+{
+ "images" : [
+ {
+ "idiom" : "iphone",
+ "size" : "20x20",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "20x20",
+ "scale" : "3x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "29x29",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "29x29",
+ "scale" : "3x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "40x40",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "40x40",
+ "scale" : "3x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "60x60",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "60x60",
+ "scale" : "3x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "20x20",
+ "scale" : "1x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "20x20",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "29x29",
+ "scale" : "1x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "29x29",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "40x40",
+ "scale" : "1x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "40x40",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "76x76",
+ "scale" : "1x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "76x76",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "83.5x83.5",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "ios-marketing",
+ "size" : "1024x1024",
+ "scale" : "1x"
+ }
+ ],
+ "info" : {
+ "version" : 1,
+ "author" : "xcode"
+ }
+}
+
diff --git a/mediapipe/examples/ios/handdetectiongpu/Assets.xcassets/Contents.json b/mediapipe/examples/ios/handdetectiongpu/Assets.xcassets/Contents.json
new file mode 100644
index 000000000..7afcdfaf8
--- /dev/null
+++ b/mediapipe/examples/ios/handdetectiongpu/Assets.xcassets/Contents.json
@@ -0,0 +1,7 @@
+{
+ "info" : {
+ "version" : 1,
+ "author" : "xcode"
+ }
+}
+
diff --git a/mediapipe/examples/ios/handdetectiongpu/BUILD b/mediapipe/examples/ios/handdetectiongpu/BUILD
new file mode 100644
index 000000000..47f1f0ed5
--- /dev/null
+++ b/mediapipe/examples/ios/handdetectiongpu/BUILD
@@ -0,0 +1,75 @@
+# Copyright 2019 The MediaPipe Authors.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+licenses(["notice"]) # Apache 2.0
+
+MIN_IOS_VERSION = "10.0"
+
+load(
+ "@build_bazel_rules_apple//apple:ios.bzl",
+ "ios_application",
+)
+
+ios_application(
+ name = "HandDetectionGpuApp",
+ bundle_id = "com.google.mediapipe.HandDetectionGpu",
+ families = [
+ "iphone",
+ "ipad",
+ ],
+ infoplists = ["Info.plist"],
+ minimum_os_version = MIN_IOS_VERSION,
+ provisioning_profile = "//mediapipe/examples/ios:provisioning_profile",
+ deps = [
+ ":HandDetectionGpuAppLibrary",
+ "@ios_opencv//:OpencvFramework",
+ ],
+)
+
+objc_library(
+ name = "HandDetectionGpuAppLibrary",
+ srcs = [
+ "AppDelegate.m",
+ "ViewController.mm",
+ "main.m",
+ ],
+ hdrs = [
+ "AppDelegate.h",
+ "ViewController.h",
+ ],
+ data = [
+ "Base.lproj/LaunchScreen.storyboard",
+ "Base.lproj/Main.storyboard",
+ "//mediapipe/graphs/hand_tracking:hand_detection_mobile_gpu_binary_graph",
+ "//mediapipe/models:palm_detection.tflite",
+ "//mediapipe/models:palm_detection_labelmap.txt",
+ ],
+ sdk_frameworks = [
+ "AVFoundation",
+ "CoreGraphics",
+ "CoreMedia",
+ "UIKit",
+ ],
+ deps = [
+ "//mediapipe/objc:mediapipe_framework_ios",
+ "//mediapipe/objc:mediapipe_input_sources_ios",
+ "//mediapipe/objc:mediapipe_layer_renderer",
+ ] + select({
+ "//mediapipe:ios_i386": [],
+ "//mediapipe:ios_x86_64": [],
+ "//conditions:default": [
+ "//mediapipe/graphs/hand_tracking:detection_mobile_calculators",
+ ],
+ }),
+)
diff --git a/mediapipe/examples/ios/handdetectiongpu/Base.lproj/LaunchScreen.storyboard b/mediapipe/examples/ios/handdetectiongpu/Base.lproj/LaunchScreen.storyboard
new file mode 100644
index 000000000..bfa361294
--- /dev/null
+++ b/mediapipe/examples/ios/handdetectiongpu/Base.lproj/LaunchScreen.storyboard
@@ -0,0 +1,25 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/mediapipe/examples/ios/handdetectiongpu/Base.lproj/Main.storyboard b/mediapipe/examples/ios/handdetectiongpu/Base.lproj/Main.storyboard
new file mode 100644
index 000000000..76dcb7823
--- /dev/null
+++ b/mediapipe/examples/ios/handdetectiongpu/Base.lproj/Main.storyboard
@@ -0,0 +1,41 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/mediapipe/examples/ios/handdetectiongpu/Info.plist b/mediapipe/examples/ios/handdetectiongpu/Info.plist
new file mode 100644
index 000000000..30db14c62
--- /dev/null
+++ b/mediapipe/examples/ios/handdetectiongpu/Info.plist
@@ -0,0 +1,42 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
+<plist version="1.0">
+<dict>
+  <key>NSCameraUsageDescription</key>
+  <string>This app uses the camera to demonstrate live video processing.</string>
+  <key>CFBundleDevelopmentRegion</key>
+  <string>en</string>
+  <key>CFBundleExecutable</key>
+  <string>$(EXECUTABLE_NAME)</string>
+  <key>CFBundleIdentifier</key>
+  <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
+  <key>CFBundleInfoDictionaryVersion</key>
+  <string>6.0</string>
+  <key>CFBundleName</key>
+  <string>$(PRODUCT_NAME)</string>
+  <key>CFBundlePackageType</key>
+  <string>APPL</string>
+  <key>CFBundleShortVersionString</key>
+  <string>1.0</string>
+  <key>CFBundleVersion</key>
+  <string>1</string>
+  <key>LSRequiresIPhoneOS</key>
+  <true/>
+  <key>UILaunchStoryboardName</key>
+  <string>LaunchScreen</string>
+  <key>UIMainStoryboardFile</key>
+  <string>Main</string>
+  <key>UIRequiredDeviceCapabilities</key>
+  <array>
+    <string>armv7</string>
+  </array>
+  <key>UISupportedInterfaceOrientations</key>
+  <array>
+    <string>UIInterfaceOrientationPortrait</string>
+  </array>
+  <key>UISupportedInterfaceOrientations~ipad</key>
+  <array>
+    <string>UIInterfaceOrientationPortrait</string>
+  </array>
+</dict>
+</plist>
diff --git a/mediapipe/examples/ios/handdetectiongpu/ViewController.h b/mediapipe/examples/ios/handdetectiongpu/ViewController.h
new file mode 100644
index 000000000..e0a5a6367
--- /dev/null
+++ b/mediapipe/examples/ios/handdetectiongpu/ViewController.h
@@ -0,0 +1,19 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+#import <UIKit/UIKit.h>
+
+@interface ViewController : UIViewController
+
+@end
diff --git a/mediapipe/examples/ios/handdetectiongpu/ViewController.mm b/mediapipe/examples/ios/handdetectiongpu/ViewController.mm
new file mode 100644
index 000000000..6ea25e7c2
--- /dev/null
+++ b/mediapipe/examples/ios/handdetectiongpu/ViewController.mm
@@ -0,0 +1,178 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+#import "ViewController.h"
+
+#import "mediapipe/objc/MPPGraph.h"
+#import "mediapipe/objc/MPPCameraInputSource.h"
+#import "mediapipe/objc/MPPLayerRenderer.h"
+
+static NSString* const kGraphName = @"hand_detection_mobile_gpu";
+
+static const char* kInputStream = "input_video";
+static const char* kOutputStream = "output_video";
+static const char* kVideoQueueLabel = "com.google.mediapipe.example.videoQueue";
+
+@interface ViewController () <MPPGraphDelegate, MPPInputSourceDelegate>
+
+// The MediaPipe graph currently in use. Initialized in viewDidLoad, started in viewWillAppear: and
+// sent video frames on _videoQueue.
+@property(nonatomic) MPPGraph* mediapipeGraph;
+
+@end
+
+@implementation ViewController {
+ /// Handles camera access via AVCaptureSession library.
+ MPPCameraInputSource* _cameraSource;
+
+ /// Inform the user when camera is unavailable.
+ IBOutlet UILabel* _noCameraLabel;
+ /// Display the camera preview frames.
+ IBOutlet UIView* _liveView;
+ /// Render frames in a layer.
+ MPPLayerRenderer* _renderer;
+
+ /// Process camera frames on this queue.
+ dispatch_queue_t _videoQueue;
+}
+
+#pragma mark - Cleanup methods
+
+- (void)dealloc {
+ self.mediapipeGraph.delegate = nil;
+ [self.mediapipeGraph cancel];
+ // Ignore errors since we're cleaning up.
+ [self.mediapipeGraph closeAllInputStreamsWithError:nil];
+ [self.mediapipeGraph waitUntilDoneWithError:nil];
+}
+
+#pragma mark - MediaPipe graph methods
+
++ (MPPGraph*)loadGraphFromResource:(NSString*)resource {
+ // Load the graph config resource.
+ NSError* configLoadError = nil;
+ NSBundle* bundle = [NSBundle bundleForClass:[self class]];
+ if (!resource || resource.length == 0) {
+ return nil;
+ }
+ NSURL* graphURL = [bundle URLForResource:resource withExtension:@"binarypb"];
+ NSData* data = [NSData dataWithContentsOfURL:graphURL options:0 error:&configLoadError];
+ if (!data) {
+ NSLog(@"Failed to load MediaPipe graph config: %@", configLoadError);
+ return nil;
+ }
+
+ // Parse the graph config resource into mediapipe::CalculatorGraphConfig proto object.
+ mediapipe::CalculatorGraphConfig config;
+ config.ParseFromArray(data.bytes, data.length);
+
+ // Create MediaPipe graph with mediapipe::CalculatorGraphConfig proto object.
+ MPPGraph* newGraph = [[MPPGraph alloc] initWithGraphConfig:config];
+ [newGraph addFrameOutputStream:kOutputStream outputPacketType:MediaPipePacketPixelBuffer];
+ return newGraph;
+}
+
+#pragma mark - UIViewController methods
+
+- (void)viewDidLoad {
+ [super viewDidLoad];
+
+ _renderer = [[MPPLayerRenderer alloc] init];
+ _renderer.layer.frame = _liveView.layer.bounds;
+ [_liveView.layer addSublayer:_renderer.layer];
+ _renderer.frameScaleMode = MediaPipeFrameScaleFillAndCrop;
+ // When using the front camera, mirror the input for a more natural look.
+ _renderer.mirrored = YES;
+
+ dispatch_queue_attr_t qosAttribute = dispatch_queue_attr_make_with_qos_class(
+ DISPATCH_QUEUE_SERIAL, QOS_CLASS_USER_INTERACTIVE, /*relative_priority=*/0);
+ _videoQueue = dispatch_queue_create(kVideoQueueLabel, qosAttribute);
+
+ _cameraSource = [[MPPCameraInputSource alloc] init];
+ [_cameraSource setDelegate:self queue:_videoQueue];
+ _cameraSource.sessionPreset = AVCaptureSessionPresetHigh;
+ _cameraSource.cameraPosition = AVCaptureDevicePositionFront;
+ // The frame's native format is rotated with respect to the portrait orientation.
+ _cameraSource.orientation = AVCaptureVideoOrientationPortrait;
+
+ self.mediapipeGraph = [[self class] loadGraphFromResource:kGraphName];
+ self.mediapipeGraph.delegate = self;
+ // Set maxFramesInFlight to a small value to avoid memory contention for real-time processing.
+ self.mediapipeGraph.maxFramesInFlight = 2;
+}
+
+// In this application, there is only one ViewController which has no navigation to other view
+// controllers, and there is only one View with live display showing the result of running the
+// MediaPipe graph on the live video feed. If more view controllers are needed later, the graph
+// setup/teardown and camera start/stop logic should be updated appropriately in response to the
+// appearance/disappearance of this ViewController, as viewWillAppear: can be invoked multiple times
+// depending on the application navigation flow in that case.
+- (void)viewWillAppear:(BOOL)animated {
+ [super viewWillAppear:animated];
+
+ [_cameraSource requestCameraAccessWithCompletionHandler:^void(BOOL granted) {
+ if (granted) {
+ [self startGraphAndCamera];
+ dispatch_async(dispatch_get_main_queue(), ^{
+ _noCameraLabel.hidden = YES;
+ });
+ }
+ }];
+}
+
+- (void)startGraphAndCamera {
+ // Start running self.mediapipeGraph.
+ NSError* error;
+ if (![self.mediapipeGraph startWithError:&error]) {
+ NSLog(@"Failed to start graph: %@", error);
+ }
+
+ // Start fetching frames from the camera.
+ dispatch_async(_videoQueue, ^{
+ [_cameraSource start];
+ });
+}
+
+#pragma mark - MPPGraphDelegate methods
+
+// Receives CVPixelBufferRef from the MediaPipe graph. Invoked on a MediaPipe worker thread.
+- (void)mediapipeGraph:(MPPGraph*)graph
+ didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
+ fromStream:(const std::string&)streamName {
+ if (streamName == kOutputStream) {
+ // Display the captured image on the screen.
+ CVPixelBufferRetain(pixelBuffer);
+ dispatch_async(dispatch_get_main_queue(), ^{
+ [_renderer renderPixelBuffer:pixelBuffer];
+ CVPixelBufferRelease(pixelBuffer);
+ });
+ }
+}
+
+#pragma mark - MPPInputSourceDelegate methods
+
+// Must be invoked on _videoQueue.
+- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
+ timestamp:(CMTime)timestamp
+ fromSource:(MPPInputSource*)source {
+ if (source != _cameraSource) {
+ NSLog(@"Unknown source: %@", source);
+ return;
+ }
+ [self.mediapipeGraph sendPixelBuffer:imageBuffer
+ intoStream:kInputStream
+ packetType:MediaPipePacketPixelBuffer];
+}
+
+@end
diff --git a/mediapipe/examples/ios/handdetectiongpu/main.m b/mediapipe/examples/ios/handdetectiongpu/main.m
new file mode 100644
index 000000000..7ffe5ea5d
--- /dev/null
+++ b/mediapipe/examples/ios/handdetectiongpu/main.m
@@ -0,0 +1,22 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+#import <UIKit/UIKit.h>
+#import "AppDelegate.h"
+
+int main(int argc, char * argv[]) {
+ @autoreleasepool {
+ return UIApplicationMain(argc, argv, nil, NSStringFromClass([AppDelegate class]));
+ }
+}
diff --git a/mediapipe/examples/ios/handtrackinggpu/AppDelegate.h b/mediapipe/examples/ios/handtrackinggpu/AppDelegate.h
new file mode 100644
index 000000000..6b0377ef2
--- /dev/null
+++ b/mediapipe/examples/ios/handtrackinggpu/AppDelegate.h
@@ -0,0 +1,21 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+#import <UIKit/UIKit.h>
+
+@interface AppDelegate : UIResponder <UIApplicationDelegate>
+
+@property(strong, nonatomic) UIWindow *window;
+
+@end
diff --git a/mediapipe/examples/ios/handtrackinggpu/AppDelegate.m b/mediapipe/examples/ios/handtrackinggpu/AppDelegate.m
new file mode 100644
index 000000000..9e1b7ff0e
--- /dev/null
+++ b/mediapipe/examples/ios/handtrackinggpu/AppDelegate.m
@@ -0,0 +1,59 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+#import "AppDelegate.h"
+
+@interface AppDelegate ()
+
+@end
+
+@implementation AppDelegate
+
+- (BOOL)application:(UIApplication *)application
+ didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
+ // Override point for customization after application launch.
+ return YES;
+}
+
+- (void)applicationWillResignActive:(UIApplication *)application {
+ // Sent when the application is about to move from active to inactive state. This can occur for
+ // certain types of temporary interruptions (such as an incoming phone call or SMS message) or
+ // when the user quits the application and it begins the transition to the background state. Use
+ // this method to pause ongoing tasks, disable timers, and invalidate graphics rendering
+ // callbacks. Games should use this method to pause the game.
+}
+
+- (void)applicationDidEnterBackground:(UIApplication *)application {
+ // Use this method to release shared resources, save user data, invalidate timers, and store
+ // enough application state information to restore your application to its current state in case
+ // it is terminated later. If your application supports background execution, this method is
+ // called instead of applicationWillTerminate: when the user quits.
+}
+
+- (void)applicationWillEnterForeground:(UIApplication *)application {
+ // Called as part of the transition from the background to the active state; here you can undo
+ // many of the changes made on entering the background.
+}
+
+- (void)applicationDidBecomeActive:(UIApplication *)application {
+ // Restart any tasks that were paused (or not yet started) while the application was inactive. If
+ // the application was previously in the background, optionally refresh the user interface.
+}
+
+- (void)applicationWillTerminate:(UIApplication *)application {
+ // Called when the application is about to terminate. Save data if appropriate. See also
+ // applicationDidEnterBackground:.
+}
+
+@end
diff --git a/mediapipe/examples/ios/handtrackinggpu/Assets.xcassets/AppIcon.appiconset/Contents.json b/mediapipe/examples/ios/handtrackinggpu/Assets.xcassets/AppIcon.appiconset/Contents.json
new file mode 100644
index 000000000..a1895a242
--- /dev/null
+++ b/mediapipe/examples/ios/handtrackinggpu/Assets.xcassets/AppIcon.appiconset/Contents.json
@@ -0,0 +1,99 @@
+{
+ "images" : [
+ {
+ "idiom" : "iphone",
+ "size" : "20x20",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "20x20",
+ "scale" : "3x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "29x29",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "29x29",
+ "scale" : "3x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "40x40",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "40x40",
+ "scale" : "3x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "60x60",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "iphone",
+ "size" : "60x60",
+ "scale" : "3x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "20x20",
+ "scale" : "1x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "20x20",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "29x29",
+ "scale" : "1x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "29x29",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "40x40",
+ "scale" : "1x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "40x40",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "76x76",
+ "scale" : "1x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "76x76",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "ipad",
+ "size" : "83.5x83.5",
+ "scale" : "2x"
+ },
+ {
+ "idiom" : "ios-marketing",
+ "size" : "1024x1024",
+ "scale" : "1x"
+ }
+ ],
+ "info" : {
+ "version" : 1,
+ "author" : "xcode"
+ }
+}
+
diff --git a/mediapipe/examples/ios/handtrackinggpu/Assets.xcassets/Contents.json b/mediapipe/examples/ios/handtrackinggpu/Assets.xcassets/Contents.json
new file mode 100644
index 000000000..7afcdfaf8
--- /dev/null
+++ b/mediapipe/examples/ios/handtrackinggpu/Assets.xcassets/Contents.json
@@ -0,0 +1,7 @@
+{
+ "info" : {
+ "version" : 1,
+ "author" : "xcode"
+ }
+}
+
diff --git a/mediapipe/examples/ios/handtrackinggpu/BUILD b/mediapipe/examples/ios/handtrackinggpu/BUILD
new file mode 100644
index 000000000..f84008fc1
--- /dev/null
+++ b/mediapipe/examples/ios/handtrackinggpu/BUILD
@@ -0,0 +1,95 @@
+# Copyright 2019 The MediaPipe Authors.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+licenses(["notice"]) # Apache 2.0
+
+MIN_IOS_VERSION = "10.0"
+
+load(
+ "@build_bazel_rules_apple//apple:ios.bzl",
+ "ios_application",
+)
+
+# To use the 3D model instead of the default 2D model, add "--define 3D=true" to the
+# bazel build command.
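+# For example, a hypothetical invocation (exact flags depend on your setup and .bazelrc):
+#   bazel build -c opt --config=ios_arm64 --define 3D=true \
+#     //mediapipe/examples/ios/handtrackinggpu:HandTrackingGpuApp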
+config_setting(
+ name = "use_3d_model",
+ define_values = {
+ "3D": "true",
+ },
+)
+
+genrule(
+ name = "model",
+ srcs = select({
+ "//conditions:default": ["//mediapipe/models:hand_landmark.tflite"],
+ ":use_3d_model": ["//mediapipe/models:hand_landmark_3d.tflite"],
+ }),
+ outs = ["hand_landmark.tflite"],
+ cmd = "cp $< $@",
+)
+
+ios_application(
+ name = "HandTrackingGpuApp",
+ bundle_id = "com.google.mediapipe.HandTrackingGpu",
+ families = [
+ "iphone",
+ "ipad",
+ ],
+ infoplists = ["Info.plist"],
+ minimum_os_version = MIN_IOS_VERSION,
+ provisioning_profile = "//mediapipe/examples/ios:provisioning_profile",
+ deps = [
+ ":HandTrackingGpuAppLibrary",
+ "@ios_opencv//:OpencvFramework",
+ ],
+)
+
+objc_library(
+ name = "HandTrackingGpuAppLibrary",
+ srcs = [
+ "AppDelegate.m",
+ "ViewController.mm",
+ "main.m",
+ ],
+ hdrs = [
+ "AppDelegate.h",
+ "ViewController.h",
+ ],
+ data = [
+ "Base.lproj/LaunchScreen.storyboard",
+ "Base.lproj/Main.storyboard",
+ ":model",
+ "//mediapipe/graphs/hand_tracking:hand_tracking_mobile_gpu_binary_graph",
+ "//mediapipe/models:palm_detection.tflite",
+ "//mediapipe/models:palm_detection_labelmap.txt",
+ ],
+ sdk_frameworks = [
+ "AVFoundation",
+ "CoreGraphics",
+ "CoreMedia",
+ "UIKit",
+ ],
+ deps = [
+ "//mediapipe/objc:mediapipe_framework_ios",
+ "//mediapipe/objc:mediapipe_input_sources_ios",
+ "//mediapipe/objc:mediapipe_layer_renderer",
+ ] + select({
+ "//mediapipe:ios_i386": [],
+ "//mediapipe:ios_x86_64": [],
+ "//conditions:default": [
+ "//mediapipe/graphs/hand_tracking:mobile_calculators",
+ ],
+ }),
+)
diff --git a/mediapipe/examples/ios/handtrackinggpu/Base.lproj/LaunchScreen.storyboard b/mediapipe/examples/ios/handtrackinggpu/Base.lproj/LaunchScreen.storyboard
new file mode 100644
index 000000000..bfa361294
--- /dev/null
+++ b/mediapipe/examples/ios/handtrackinggpu/Base.lproj/LaunchScreen.storyboard
@@ -0,0 +1,25 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/mediapipe/examples/ios/handtrackinggpu/Base.lproj/Main.storyboard b/mediapipe/examples/ios/handtrackinggpu/Base.lproj/Main.storyboard
new file mode 100644
index 000000000..76dcb7823
--- /dev/null
+++ b/mediapipe/examples/ios/handtrackinggpu/Base.lproj/Main.storyboard
@@ -0,0 +1,41 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/mediapipe/examples/ios/handtrackinggpu/Info.plist b/mediapipe/examples/ios/handtrackinggpu/Info.plist
new file mode 100644
index 000000000..30db14c62
--- /dev/null
+++ b/mediapipe/examples/ios/handtrackinggpu/Info.plist
@@ -0,0 +1,42 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
+<plist version="1.0">
+<dict>
+  <key>NSCameraUsageDescription</key>
+  <string>This app uses the camera to demonstrate live video processing.</string>
+  <key>CFBundleDevelopmentRegion</key>
+  <string>en</string>
+  <key>CFBundleExecutable</key>
+  <string>$(EXECUTABLE_NAME)</string>
+  <key>CFBundleIdentifier</key>
+  <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
+  <key>CFBundleInfoDictionaryVersion</key>
+  <string>6.0</string>
+  <key>CFBundleName</key>
+  <string>$(PRODUCT_NAME)</string>
+  <key>CFBundlePackageType</key>
+  <string>APPL</string>
+  <key>CFBundleShortVersionString</key>
+  <string>1.0</string>
+  <key>CFBundleVersion</key>
+  <string>1</string>
+  <key>LSRequiresIPhoneOS</key>
+  <true/>
+  <key>UILaunchStoryboardName</key>
+  <string>LaunchScreen</string>
+  <key>UIMainStoryboardFile</key>
+  <string>Main</string>
+  <key>UIRequiredDeviceCapabilities</key>
+  <array>
+    <string>armv7</string>
+  </array>
+  <key>UISupportedInterfaceOrientations</key>
+  <array>
+    <string>UIInterfaceOrientationPortrait</string>
+  </array>
+  <key>UISupportedInterfaceOrientations~ipad</key>
+  <array>
+    <string>UIInterfaceOrientationPortrait</string>
+  </array>
+</dict>
+</plist>
diff --git a/mediapipe/examples/ios/handtrackinggpu/ViewController.h b/mediapipe/examples/ios/handtrackinggpu/ViewController.h
new file mode 100644
index 000000000..e0a5a6367
--- /dev/null
+++ b/mediapipe/examples/ios/handtrackinggpu/ViewController.h
@@ -0,0 +1,19 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+#import <UIKit/UIKit.h>
+
+@interface ViewController : UIViewController
+
+@end
diff --git a/mediapipe/examples/ios/handtrackinggpu/ViewController.mm b/mediapipe/examples/ios/handtrackinggpu/ViewController.mm
new file mode 100644
index 000000000..bde56843d
--- /dev/null
+++ b/mediapipe/examples/ios/handtrackinggpu/ViewController.mm
@@ -0,0 +1,178 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+#import "ViewController.h"
+
+#import "mediapipe/objc/MPPGraph.h"
+#import "mediapipe/objc/MPPCameraInputSource.h"
+#import "mediapipe/objc/MPPLayerRenderer.h"
+
+static NSString* const kGraphName = @"hand_tracking_mobile_gpu";
+
+static const char* kInputStream = "input_video";
+static const char* kOutputStream = "output_video";
+static const char* kVideoQueueLabel = "com.google.mediapipe.example.videoQueue";
+
+@interface ViewController () <MPPGraphDelegate, MPPInputSourceDelegate>
+
+// The MediaPipe graph currently in use. Initialized in viewDidLoad, started in viewWillAppear: and
+// sent video frames on _videoQueue.
+@property(nonatomic) MPPGraph* mediapipeGraph;
+
+@end
+
+@implementation ViewController {
+ /// Handles camera access via AVCaptureSession library.
+ MPPCameraInputSource* _cameraSource;
+
+ /// Inform the user when camera is unavailable.
+ IBOutlet UILabel* _noCameraLabel;
+ /// Display the camera preview frames.
+ IBOutlet UIView* _liveView;
+ /// Render frames in a layer.
+ MPPLayerRenderer* _renderer;
+
+ /// Process camera frames on this queue.
+ dispatch_queue_t _videoQueue;
+}
+
+#pragma mark - Cleanup methods
+
+- (void)dealloc {
+ self.mediapipeGraph.delegate = nil;
+ [self.mediapipeGraph cancel];
+ // Ignore errors since we're cleaning up.
+ [self.mediapipeGraph closeAllInputStreamsWithError:nil];
+ [self.mediapipeGraph waitUntilDoneWithError:nil];
+}
+
+#pragma mark - MediaPipe graph methods
+
++ (MPPGraph*)loadGraphFromResource:(NSString*)resource {
+ // Load the graph config resource.
+ NSError* configLoadError = nil;
+ NSBundle* bundle = [NSBundle bundleForClass:[self class]];
+ if (!resource || resource.length == 0) {
+ return nil;
+ }
+ NSURL* graphURL = [bundle URLForResource:resource withExtension:@"binarypb"];
+ NSData* data = [NSData dataWithContentsOfURL:graphURL options:0 error:&configLoadError];
+ if (!data) {
+ NSLog(@"Failed to load MediaPipe graph config: %@", configLoadError);
+ return nil;
+ }
+
+ // Parse the graph config resource into mediapipe::CalculatorGraphConfig proto object.
+ mediapipe::CalculatorGraphConfig config;
+ config.ParseFromArray(data.bytes, data.length);
+
+ // Create MediaPipe graph with mediapipe::CalculatorGraphConfig proto object.
+ MPPGraph* newGraph = [[MPPGraph alloc] initWithGraphConfig:config];
+ [newGraph addFrameOutputStream:kOutputStream outputPacketType:MediaPipePacketPixelBuffer];
+ return newGraph;
+}
+
+#pragma mark - UIViewController methods
+
+- (void)viewDidLoad {
+ [super viewDidLoad];
+
+ _renderer = [[MPPLayerRenderer alloc] init];
+ _renderer.layer.frame = _liveView.layer.bounds;
+ [_liveView.layer addSublayer:_renderer.layer];
+ _renderer.frameScaleMode = MediaPipeFrameScaleFillAndCrop;
+ // When using the front camera, mirror the input for a more natural look.
+ _renderer.mirrored = YES;
+
+ dispatch_queue_attr_t qosAttribute = dispatch_queue_attr_make_with_qos_class(
+ DISPATCH_QUEUE_SERIAL, QOS_CLASS_USER_INTERACTIVE, /*relative_priority=*/0);
+ _videoQueue = dispatch_queue_create(kVideoQueueLabel, qosAttribute);
+
+ _cameraSource = [[MPPCameraInputSource alloc] init];
+ [_cameraSource setDelegate:self queue:_videoQueue];
+ _cameraSource.sessionPreset = AVCaptureSessionPresetHigh;
+ _cameraSource.cameraPosition = AVCaptureDevicePositionFront;
+ // The frame's native format is rotated with respect to the portrait orientation.
+ _cameraSource.orientation = AVCaptureVideoOrientationPortrait;
+
+ self.mediapipeGraph = [[self class] loadGraphFromResource:kGraphName];
+ self.mediapipeGraph.delegate = self;
+ // Set maxFramesInFlight to a small value to avoid memory contention for real-time processing.
+ self.mediapipeGraph.maxFramesInFlight = 2;
+}
+
+// In this application, there is only one ViewController which has no navigation to other view
+// controllers, and there is only one View with live display showing the result of running the
+// MediaPipe graph on the live video feed. If more view controllers are needed later, the graph
+// setup/teardown and camera start/stop logic should be updated appropriately in response to the
+// appearance/disappearance of this ViewController, as viewWillAppear: can be invoked multiple times
+// depending on the application navigation flow in that case.
+- (void)viewWillAppear:(BOOL)animated {
+ [super viewWillAppear:animated];
+
+ [_cameraSource requestCameraAccessWithCompletionHandler:^void(BOOL granted) {
+ if (granted) {
+ [self startGraphAndCamera];
+ dispatch_async(dispatch_get_main_queue(), ^{
+ _noCameraLabel.hidden = YES;
+ });
+ }
+ }];
+}
+
+- (void)startGraphAndCamera {
+ // Start running self.mediapipeGraph.
+ NSError* error;
+ if (![self.mediapipeGraph startWithError:&error]) {
+ NSLog(@"Failed to start graph: %@", error);
+ }
+
+ // Start fetching frames from the camera.
+ dispatch_async(_videoQueue, ^{
+ [_cameraSource start];
+ });
+}
+
+#pragma mark - MPPGraphDelegate methods
+
+// Receives CVPixelBufferRef from the MediaPipe graph. Invoked on a MediaPipe worker thread.
+- (void)mediapipeGraph:(MPPGraph*)graph
+ didOutputPixelBuffer:(CVPixelBufferRef)pixelBuffer
+ fromStream:(const std::string&)streamName {
+ if (streamName == kOutputStream) {
+ // Display the captured image on the screen.
+ CVPixelBufferRetain(pixelBuffer);
+ dispatch_async(dispatch_get_main_queue(), ^{
+ [_renderer renderPixelBuffer:pixelBuffer];
+ CVPixelBufferRelease(pixelBuffer);
+ });
+ }
+}
+
+#pragma mark - MPPInputSourceDelegate methods
+
+// Must be invoked on _videoQueue.
+- (void)processVideoFrame:(CVPixelBufferRef)imageBuffer
+ timestamp:(CMTime)timestamp
+ fromSource:(MPPInputSource*)source {
+ if (source != _cameraSource) {
+ NSLog(@"Unknown source: %@", source);
+ return;
+ }
+ [self.mediapipeGraph sendPixelBuffer:imageBuffer
+ intoStream:kInputStream
+ packetType:MediaPipePacketPixelBuffer];
+}
+
+@end
diff --git a/mediapipe/examples/ios/handtrackinggpu/main.m b/mediapipe/examples/ios/handtrackinggpu/main.m
new file mode 100644
index 000000000..7ffe5ea5d
--- /dev/null
+++ b/mediapipe/examples/ios/handtrackinggpu/main.m
@@ -0,0 +1,22 @@
+// Copyright 2019 The MediaPipe Authors.
+//
+// Licensed under the Apache License, Version 2.0 (the "License");
+// you may not use this file except in compliance with the License.
+// You may obtain a copy of the License at
+//
+// http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing, software
+// distributed under the License is distributed on an "AS IS" BASIS,
+// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+// See the License for the specific language governing permissions and
+// limitations under the License.
+
+#import <UIKit/UIKit.h>
+#import "AppDelegate.h"
+
+int main(int argc, char * argv[]) {
+ @autoreleasepool {
+ return UIApplicationMain(argc, argv, nil, NSStringFromClass([AppDelegate class]));
+ }
+}
diff --git a/mediapipe/models/hand_landmark.tflite b/mediapipe/models/hand_landmark.tflite
new file mode 100644
index 000000000..a17d9c247
Binary files /dev/null and b/mediapipe/models/hand_landmark.tflite differ
diff --git a/mediapipe/models/hand_landmark_3d.tflite b/mediapipe/models/hand_landmark_3d.tflite
new file mode 100644
index 000000000..8ca1c3d0a
Binary files /dev/null and b/mediapipe/models/hand_landmark_3d.tflite differ
diff --git a/mediapipe/models/palm_detection.tflite b/mediapipe/models/palm_detection.tflite
new file mode 100644
index 000000000..94c984cbf
Binary files /dev/null and b/mediapipe/models/palm_detection.tflite differ
diff --git a/mediapipe/models/palm_detection_labelmap.txt b/mediapipe/models/palm_detection_labelmap.txt
new file mode 100644
index 000000000..f3bf607d7
--- /dev/null
+++ b/mediapipe/models/palm_detection_labelmap.txt
@@ -0,0 +1 @@
+Palm