
[WIP][Spot] Spot's person following demo #1326

Closed

Conversation

sktometometo
Contributor

@sktometometo sktometometo commented Mar 4, 2021

This demo enables Spot to follow a person.

PXL_20210329_060658938.mp4

Prerequisites

This demo requires the packages below.

How to run

Before running this demo, please launch the following and prepare a controller.

  • jsk_spot_bringup.launch
  • object_detection_and_tracking.launch
  • multi_object_detector.launch

And then, please run:

roslaunch spot_person_follower demo.launch

After this, you can start the following behavior by pressing the L2 button on the controller.
Spot will follow the person nearest to it at the time of the press.
If you want to stop the behavior, please press the L2 button again.
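The toggle-and-lock behavior above can be sketched in plain Python. Note that `FollowPersonDemo` and `Track` are hypothetical names for illustration; the actual node in `spot_person_follower` subscribes to the joy and tracking topics and may be structured differently.

```python
# Minimal sketch of the L2-button toggle and nearest-person selection.
# FollowPersonDemo and Track are illustrative names, not the actual node.
import math
from dataclasses import dataclass


@dataclass
class Track:
    """A tracked person with a position in the robot's frame (meters)."""
    track_id: int
    x: float
    y: float

    def distance(self) -> float:
        # Planar distance from the robot to this person.
        return math.hypot(self.x, self.y)


class FollowPersonDemo:
    def __init__(self):
        self.following = False
        self.target_id = None

    def on_l2_pressed(self, tracks):
        """Toggle following; lock onto the nearest person when enabling."""
        if self.following:
            # Second press: stop following and drop the target.
            self.following = False
            self.target_id = None
        elif tracks:
            # First press: pick the person nearest at the time of the press.
            nearest = min(tracks, key=Track.distance)
            self.following = True
            self.target_id = nearest.track_id
```

Once following, the node would keep commanding Spot toward whichever track carries `target_id`, which is why the lock happens only at press time.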

@k-okada
Member

k-okada commented Mar 22, 2021

@sktometometo
where is the node to publish /spot_recognition/bbox_array, /spot_recognition/tracking_labels ?

Cc: @tongtybj

@tongtybj

@sktometometo

Please add the human tracking node to your launch file, and I strongly recommend cleaning up your commits.
Commit messages like "fix bugs" and "update" are trivial; please specify what kind of bugs or updates you address.
Also, please add a description for this PR.

BTW, I am very interested in this demo. Can I try it today or tomorrow?

@sktometometo
Contributor Author

@k-okada @tongtybj
This demo requires the outputs of the rect_array_in_panorama_to_bounding_box_array node in object_detection_and_tracking.launch and deep_sort_tracker_node.py in multi_object_tracker.launch. Currently, both launch files are in the jsk_spot_startup package.

Sorry for the dirty commit history; this is a development branch.
The demo now works, so I will clean up the commits and add descriptions.

I will go to the lab this evening.

@tongtybj

@sktometometo

I also plan to go to the lab today.
Can you show me this demo then?

@sktometometo
Contributor Author

@tongtybj
Yes, I can show you the current demo. No problem.

@k-okada
Member

k-okada commented Mar 22, 2021

@sktometometo do you have a bag file for this demo? @tongtybj is creating a tracking node, and he will create sample launch files that publish 3D person / human / car trajectories.

@sktometometo
Contributor Author

I think the rosbag files in this drive directory can be used.

In order to reproduce Spot's recognition function, you need two ROS workspaces, because these rosbag files only contain sensor data outputs and object detection is currently done by coral_usb_ros.
The first one can be created with the rosinstall file in my PR (the README.md describes how to create a workspace for Spot).
The second one has to include coral_usb_ros, jsk_robot with my PR, and jsk_perception with my PR about rect_array_in_panorama_to_bounding_box_array.

In order to play it back, please run three launch files.

# In the first workspace
roslaunch jsk_spot_startup play.launch rosbag:=<absolute path to rosbag>
# In the second workspace
roslaunch jsk_spot_startup object_detection_and_tracking.launch
# In the first workspace
roslaunch jsk_spot_startup multi_object_tracker.launch

The third launch file requires CUDA and chainer <= 6.7.0.

@tongtybj

@sktometometo

Thank you for your kind instructions.

So, from my understanding,
object detection (from coral_usb_ros) is used to provide the first bounding box to feed and start deep_sort_tracker, right?
And the reason you need two ROS workspaces is that coral_usb_ros is based on Python 3, right?

I am developing a new tracker, https://github.com/tongtybj/detr, and I am going to try it with your rosbag data first.
I hope my new tracker can outperform deep_sort_tracker.

@sktometometo
Contributor Author

@tongtybj
rect_array_in_panorama_to_bounding_box_array and deep_sort_tracker run independently, since deep_sort_tracker uses only the object detection results in the 2D image plane.
The integration of multi-object tracking with 3D geometric information is done in this demo.
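The integration step described here can be sketched as: take a tracked 2D bounding box, combine it with a depth estimate, and back-project its center through a pinhole camera model to get a 3D position. The intrinsics, helper names, and the pinhole assumption are illustrative only; the real demo derives geometry from the panorama via rect_array_in_panorama_to_bounding_box_array.

```python
# Illustrative sketch: lift a tracked 2D bounding box into 3D with a
# pinhole camera model. PinholeCamera and bbox_to_3d are hypothetical
# names; the actual demo uses panorama geometry, not a plain pinhole.
from dataclasses import dataclass


@dataclass
class PinholeCamera:
    fx: float  # focal length in pixels, x
    fy: float  # focal length in pixels, y
    cx: float  # principal point, x
    cy: float  # principal point, y


def bbox_to_3d(cam, bbox, depth):
    """bbox = (x_min, y_min, x_max, y_max) in pixels; depth in meters.

    Returns the (X, Y, Z) position of the bbox center in the camera frame.
    """
    # Center of the tracked bounding box in image coordinates.
    u = 0.5 * (bbox[0] + bbox[2])
    v = 0.5 * (bbox[1] + bbox[3])
    # Standard pinhole back-projection at the given depth.
    X = (u - cam.cx) * depth / cam.fx
    Y = (v - cam.cy) * depth / cam.fy
    return (X, Y, depth)
```

Because the tracker only maintains identities in the image plane, a step like this is what turns "track 2" into a 3D goal the robot can actually walk toward.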

@sktometometo
Contributor Author

And yes, coral_usb_ros requires Python 3, so we need to create two ROS workspaces.

@k-okada
Member

k-okada commented Mar 22, 2021

@sktometometo I think you can create a bag file that includes the images and /edgetpu_human_pose_estimator/output/poses, so that @tongtybj can skip the workspace setup.

@sktometometo
Contributor Author

@k-okada OK, I will create it later.

@sktometometo sktometometo changed the title [WIP] spot's person following demo Spot's person following demo Mar 29, 2021
@sktometometo sktometometo marked this pull request as ready for review March 29, 2021 06:59
@sktometometo sktometometo changed the title Spot's person following demo [WIP][Spot] Spot's person following demo Aug 20, 2021
@sktometometo sktometometo deleted the feature/spot/follow_person branch August 22, 2021 01:01
@sktometometo
Contributor Author

Moved to #1343
