Project import generated by Copybara.

GitOrigin-RevId: 1a0caa03bbf3673dbe772c8045b687c6b6821bcc
Commit c6fea4c9d9 (parent 259b48e082) by MediaPipe Team, 2019-10-25 14:47:47 -07:00, committed by jqtang

1. Download the YT8M dataset.

    For example, download one shard of the training data:

    ```bash
    # curl does not create directories, so create the target directory first.
    mkdir -p /tmp/mediapipe
    curl http://us.data.yt8m.org/2/frame/train/trainpj.tfrecord --output /tmp/mediapipe/trainpj.tfrecord
    ```
2. Copy the baseline model [(model card)](https://drive.google.com/file/d/1xTCi9-Nm9dt2KIk8WR0dDFrIssWawyXy/view) to a local directory.

    ```bash
    curl -o /tmp/mediapipe/yt8m_baseline_saved_model.tar.gz data.yt8m.org/models/baseline/saved_model.tar.gz
    tar -xvf /tmp/mediapipe/yt8m_baseline_saved_model.tar.gz -C /tmp/mediapipe
    ```
3. Build and run the inference binary.

    ```bash
    bazel build -c opt --define='MEDIAPIPE_DISABLE_GPU=1' \
      mediapipe/examples/desktop/youtube8m:model_inference

    GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/youtube8m/model_inference \
      --calculator_graph_config_file=mediapipe/graphs/youtube8m/yt8m_dataset_model_inference.pbtxt \
      --input_side_packets=tfrecord_path=/tmp/mediapipe/trainpj.tfrecord,record_index=0,desired_segment_size=5 \
      --output_stream=annotation_summary \
      --output_stream_file=/tmp/summary \
      --output_side_packets=yt8m_id \
      --output_side_packets_file=/tmp/yt8m_id
    ```
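After the run completes, the stream and side-packet dumps can be inspected directly. A minimal sketch, assuming the graph writes both files as plain text (the file names are the ones passed above):

```bash
# Print the first few lines of each output file written by the run above.
# Guarded so it exits cleanly if the run has not produced them yet.
for f in /tmp/summary /tmp/yt8m_id; do
  if [ -f "$f" ]; then
    echo "== $f =="
    head -n 5 "$f"
  fi
done
```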
### Steps to run the YouTube-8M model inference graph with Web Interface
1. Copy the baseline model [(model card)](https://drive.google.com/file/d/1xTCi9-Nm9dt2KIk8WR0dDFrIssWawyXy/view) to a local directory.

    ```bash
    # curl does not create directories, so create the target directory first.
    mkdir -p /tmp/mediapipe
    curl -o /tmp/mediapipe/yt8m_baseline_saved_model.tar.gz data.yt8m.org/models/baseline/saved_model.tar.gz
    tar -xvf /tmp/mediapipe/yt8m_baseline_saved_model.tar.gz -C /tmp/mediapipe
    ```
2. Build the inference binary.

    ```bash
    bazel build -c opt --define='MEDIAPIPE_DISABLE_GPU=1' \
      mediapipe/examples/desktop/youtube8m:model_inference
    ```
3. Run the Python web server.

    Note: this requires `absl-py` (`pip install absl-py`).

    ```bash
    python mediapipe/examples/desktop/youtube8m/viewer/server.py --root `pwd`
    ```
Navigate to localhost:8008 in a web browser.
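Before opening the browser, a quick reachability check can confirm the viewer started. A sketch, assuming the server binds to the localhost:8008 address mentioned above:

```bash
# Prints "viewer is up" when something answers on localhost:8008.
if curl -s --max-time 2 -o /dev/null http://localhost:8008; then
  echo "viewer is up"
else
  echo "viewer is down (is server.py still running?)"
fi
```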
### Steps to run the YouTube-8M model inference graph with a local video
3. Build and run the inference binary.

    ```bash
    bazel build -c opt --define='MEDIAPIPE_DISABLE_GPU=1' \
      mediapipe/examples/desktop/youtube8m:model_inference

    # segment_size is the length, in seconds, of each window of frames.
    # overlap is the number of seconds adjacent segments share.
    GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/youtube8m/model_inference \
      --calculator_graph_config_file=mediapipe/graphs/youtube8m/local_video_model_inference.pbtxt \
      --input_side_packets=input_sequence_example_path=/tmp/mediapipe/output.tfrecord,input_video_path=/absolute/path/to/the/local/video/file,output_video_path=/tmp/mediapipe/annotated_video.mp4,segment_size=5,overlap=4
    ```
4. View the annotated video.
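The annotated clip is written to the `output_video_path` given above, and any video player can open it. A sketch using ffmpeg's `ffplay`; having ffmpeg installed is an assumption, not a requirement of the example:

```bash
VIDEO=/tmp/mediapipe/annotated_video.mp4
if command -v ffplay >/dev/null 2>&1 && [ -f "$VIDEO" ]; then
  # -autoexit closes the window when playback finishes.
  ffplay -autoexit "$VIDEO"
else
  echo "open $VIDEO in any video player"
fi
```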