📢 Call for Use Cases: We Want to Hear from You! 🛠 #113

Open · nikitakaraevv opened this issue Oct 17, 2024 · 7 comments
@nikitakaraevv (Contributor)

Hey everyone! 👋

First off, a big shoutout to all of you for being part of the CoTracker community. Thanks for the support and for using the project in all the awesome ways you do! 🙌

We’d love to get a better idea of how you’re using CoTracker in your work. Whether it's powering your AI research, building applications, or something else entirely — we’re super curious to know! 🚀 And if there’s anything you think could be improved or added, we’re all ears.

You can drop a comment right here or feel free to reach out to us directly via email at [email protected] if that’s easier. We’d love to hear from you either way!

Hearing about your experiences helps us figure out what’s working well and what might need some tweaking. So if you’ve got a minute, it’d be awesome to hear your thoughts!

Thanks again for being a part of this journey! We’re excited to see what you’re all up to. 🌟

@Khoa-NT commented Oct 18, 2024

Thank you for starting this discussion!
I tried out the HF demo with the bear example and noticed that it didn’t track the feet as expected. Is this a particularly challenging case for point tracking? I’m not trying to compare models, just trying to understand what might be happening.

[Screenshot: HF demo output on the bear example]

@dinrao commented Oct 23, 2024

Hi, I am keenly following the development of these point trackers. My use case is tracking the flight/movement of insects in natural environments, usually against cluttered backgrounds with lots of occlusion. Generally I need at least 2 points on the insect, but the ability to track many points opens up a wealth of information about animal movement. I have tried the demo version, but since I’m lacking in technical skills (it took me a while to figure out how to change the path to the file!), I still haven’t been able to run it on my own videos. Your documentation is good, but it’s still very complicated to follow for someone like me coming from a biology background.

@nikitakaraevv (Contributor, Author)

Hi @dinrao, thank you for sharing your use case! Did you try the Hugging Face demo as well? It should be relatively straightforward to upload your own video there: https://huggingface.co/spaces/facebook/cotracker
What do you want to achieve by tracking insects?
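If you’d rather run it locally on your own file, something along these lines should work. This is only a rough sketch of the torch.hub quick start from the README (using the cotracker3_offline entry point); the video path and grid_size below are placeholders to replace with your own:

```python
import torch
from cotracker.utils.visualizer import Visualizer, read_video_from_path

device = "cuda" if torch.cuda.is_available() else "cpu"

# Load the pretrained offline model via torch.hub (weights download on first use).
cotracker = torch.hub.load("facebookresearch/co-tracker", "cotracker3_offline").to(device)

# Read your own video; replace the path with your file.
video = read_video_from_path("./my_insect_video.mp4")  # numpy array, (T, H, W, C)
video = torch.from_numpy(video).permute(0, 3, 1, 2)[None].float().to(device)  # (1, T, C, H, W)

# Track a regular grid of points across the whole clip.
pred_tracks, pred_visibility = cotracker(video, grid_size=10)  # tracks: (1, T, N, 2)

# Save a video with the tracks overlaid.
vis = Visualizer(save_dir="./videos", pad_value=120, linewidth=3)
vis.visualize(video, pred_tracks, pred_visibility)
```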

@dinrao commented Oct 23, 2024

Yes, I tried the demo and it works well, but I have longer videos. Here’s an example of how tracking helps in animal behaviour: we manually tracked wasps approaching spiders on flowers to see if they can detect them from a distance (see the supplementary videos for an example).

@JackIRose

I would like to know what the format of your training set is, and how I can convert my own videos into that format through annotation or other methods. Could you give me a sample or a tutorial? Thank you very much.

@JackIRose
To add some context to my question above: I have a video of a mouse and want to track its nose and ears, but the pretrained model doesn’t perform very well on it, so I’d like to know how to construct a training set for fine-tuning.

@Jafonso-sudo commented Nov 11, 2024

Hi! I’m working on using CoTracker for 6D pose estimation of objects and am experimenting with ways to improve robustness. My approach involves adding an initialization video that shows the object from multiple angles, then querying points from each view to help the model better recognize the object.

However, I’ve encountered a significant issue: CoTracker’s feature extraction doesn’t seem to be 2D rotation-invariant. When the object appears upright in the query frame but upside down in an inference frame, the model struggles to recognize it as the same object. The images below illustrate this. In the first image (the query frame) the object is upright. As the video progresses and the object rotates, performance degrades: at 90 degrees (second image) it drops noticeably, and when the object is upside down (third image) CoTracker struggles to recognize it at all. As the rotation continues back towards the starting orientation, the model recovers, apparently realizing it is still looking at the same object as in the query frame: at 270 degrees (fourth image) performance improves noticeably, and at roughly 315 degrees (last image), with the object almost back in its original position, it recovers almost fully.

My question: Is there an easy way to use multiple query frames per point? Specifically, can I link different views of the same point across frames? This way, I could provide rotated views of the original query frame, giving the model multiple examples of the same point when the object undergoes significant rotation.
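For context, this is roughly how I’m passing queries today, following the per-point (t, x, y) query format from the README. Each query can come from a different frame, but as far as I can tell nothing tells the model that several queries are views of the same physical point; the video shape and coordinates below are made-up placeholders:

```python
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
cotracker = torch.hub.load("facebookresearch/co-tracker", "cotracker3_offline").to(device)

# Placeholder clip standing in for my initialization video + inference frames,
# as a (1, T, C, H, W) float tensor.
video = torch.zeros(1, 120, 3, 384, 512, device=device)

# Each query row is (frame_index, x, y); the frame index can differ per point,
# so points can be seeded from several views of the object.
queries = torch.tensor([
    [0.,  400., 250.],   # a corner of the object, upright in the query frame
    [60., 280., 300.],   # the same corner again at frame 60, where the object is rotated
    [0.,  150., 200.],   # a second point from the query frame
], device=device)

# Passing queries (with a batch dim) instead of grid_size tracks exactly these points,
# but rows 0 and 1 are treated as two independent points.
pred_tracks, pred_visibility = cotracker(video, queries=queries[None])
```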

If there’s no built-in way to achieve this, do you have any suggestions for how I might integrate this feature into CoTracker’s architecture without needing to retrain the model? I have been trying and struggling.

Thanks in advance! I understand this isn’t the primary use case for the model, but I’m hopeful that adding rotation handling could enhance its performance.

[Five screenshots: query frame, then the object rotated 90 degrees, 180 degrees, 270 degrees, and ~315 degrees]
