Added inspect_points datashader operation #4794
Conversation
I might prefer
Looks great so far!
Can you maybe make the default marker shape an
I think this PR will probably need some of the fixes planned in #4792
(branch force-pushed from 5555326 to a2f6d8e)
One comment Philipp made in #4796 (now closed) is that this operation should handle RGBs as well as RGBAs. I suggest raising an exception unless we want to declare a specific color as a null value.
Is it essential to detect a null value, or just helpful? If it's not essential, I'd print a helpful warning and skip checking for nulls, unless that makes it significantly more complex.
It is not essential, but if you can't detect nulls you can't optimize when inspecting sparse data. A warning seems fine.
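For illustration, a hedged sketch of that behaviour, assuming zero alpha marks null pixels in RGBA output; the helper name and the alpha-based test are hypothetical, not the PR's code:

```python
import warnings
import numpy as np

def detect_null_mask(img: np.ndarray):
    """Return a boolean mask of empty (null) pixels, or None if undetectable.

    Hypothetical helper for illustration only: assumes an (ny, nx, 4) RGBA
    array where zero alpha means "no data"; plain (ny, nx, 3) RGB arrays
    carry no such marker, so we warn and skip the sparse-data optimization.
    """
    if img.ndim == 3 and img.shape[-1] == 4:
        return img[..., 3] == 0
    warnings.warn("RGB input has no alpha channel; cannot detect null "
                  "pixels, so sparse regions will not be optimized.")
    return None
```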
To make sure this isn't forgotten, I realized one bug we have: the null test should be over the mask area and not just a point test. We can also have an optimization where we pick the first sample we find falling within a single pixel (assuming anything after that doesn't matter).
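A rough sketch of that mask-area test with the first-hit optimization, assuming a boolean null mask and a set of pixel offsets describing the mask shape; all names here are hypothetical:

```python
import numpy as np

def any_sample_under_mask(null_mask: np.ndarray, ix: int, iy: int,
                          mask_offsets) -> bool:
    """True if any non-null raster cell falls under the mask centred on (ix, iy).

    null_mask is a boolean (ny, nx) array where True means "no data";
    mask_offsets is an iterable of (dx, dy) cell offsets describing the mask.
    Illustrative sketch only; the PR's actual test may differ.
    """
    ny, nx = null_mask.shape
    for dx, dy in mask_offsets:
        x, y = ix + dx, iy + dy
        if 0 <= x < nx and 0 <= y < ny and not null_mask[y, x]:
            # First hit is enough: anything found after it doesn't matter.
            return True
    return False
```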
Other things I would like to attempt before merging:
Other than that (and any unaddressed items above), I am very happy with this new operation! At a later date we can consider
Edit: Glancing at the code, the plain dataframe support looks like it might already be there. The idea of a warning remains, though.
@jbednar @philippjfr Ready for review. Hoping for some feedback on the API (e.g. parameter names, the mask operation, arrangement of the transformers). Once things settle, I'll add unit tests. Happy to add docs in this PR but can also open a new PR if we want the code merged sooner.
This will be very useful. I find the names transformer, hits_transformer, and points_transformer confusing (and verbose), though. E.g. doesn't transformer also transform hits, just as much as hits_transformer does? Maybe match_transform, matches_to_points, matches_to_hits?
It would be nice if this operation could be passed Points as an argument instead of being specialized for points, but I can't see how that could work.
Will it be able to e.g. select an entire connected trajectory, given one point on that trajectory?
holoviews/operation/datashader.py (outdated)
Maximum number of points to display within the mask of size pixels. Points are prioritized by distance from the cursor point. This means that the default value of one shows the single closest sample to the cursor.
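For illustration, a minimal sketch of the distance prioritization this docstring describes, assuming a plain pandas DataFrame with x/y columns; the function and parameter names are hypothetical, not the PR's API:

```python
import pandas as pd

def closest_samples(df: pd.DataFrame, x: float, y: float,
                    max_points: int = 1) -> pd.DataFrame:
    """Return up to max_points rows of df, ordered by distance to (x, y).

    Assumes 'x' and 'y' columns; with the default of one, only the single
    closest sample to the cursor is returned.
    """
    dist2 = (df['x'] - x) ** 2 + (df['y'] - y) ** 2
    return df.loc[dist2.nsmallest(max_points).index]
```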
Again, I'm not sure that's well defined; seems like all datapoints falling into the cursor's pixel are equivalent, and they don't need to be the closest (e.g. to the cursor pixel's center) to be appropriate; I'd just take any of them as there's no particular order that can be established on them.
> I'd just take any of them as there's no particular order that can be established on them.
That is a possible optimization and not true in all cases.
If the datashading is occurring at screen resolution and Bokeh's hover/tap is at the same resolution then it is true (well, the above is perfectly accurate but limited by the precision supplied by Bokeh), but it is very easy to ask rasterize or datashade to output rasters where that is not true at all (i.e. bins much bigger than screen pixels by specifying custom x_sampling and y_sampling).
As currently defined, this inspector will work correctly for all datashaded output, and this docstring is correct. For a lot of common datashader output, where things are configured to automatically make datashaded bins match screen pixels, your suggested change would work because Bokeh isn't giving anything past screen-pixel precision (and even this might not be the case; for all I know a mouse pointer can specify an x, y position that is finer than a pixel! What happens if you set your screen to 640x480 resolution?).
Because of all this, I think the docstring is well defined and correct even if the precision with which the cursor's position is specified is limited.
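For example, a hedged sketch of asking rasterize for bins much coarser than screen pixels; the data and sampling values here are illustrative assumptions:

```python
import numpy as np
import holoviews as hv
from holoviews.operation.datashader import rasterize

hv.extension('bokeh')

# Hypothetical scattered points spanning a 0-100 x 0-100 data range.
points = hv.Points(np.random.rand(10_000, 2) * 100)

# Requesting bins at least 10 data units on a side: each raster cell can
# now cover many screen pixels, so the samples sharing a cell are not all
# equally close to the cursor position Bokeh reports.
coarse = rasterize(points, x_sampling=10, y_sampling=10)
```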
I've addressed quite a few of your suggestions, thanks! The remaining items need a bit more discussion though.
No. This operation is explicitly
Happy to use these, though I don't find them particularly compelling either. Maybe
Could then provide a generic
I like this but how would
The dask pinning is now in #4803 and we can decide what to do there. Tests on this PR will be failing unless we reintroduce the pin, fix the underlying issue (e.g. update the tests?), or rebase after that PR is merged to master.
Regarding naming, here are some generic (so that we have support
Co-authored-by: James A. Bednar <[email protected]>
(branch force-pushed from 420e5da to 102732f)
@jbednar @philippjfr All tests are passing and I am happy with the API. There are a few things left to do but I think we can merge now and cut a dev release:
Going ahead and merging.
This pull request has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.
WIP.
Here is the example code
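A minimal sketch of this kind of usage, assuming stand-in random data and illustrative styling options rather than the PR's original snippet; inspect_points is the operation added by this PR:

```python
import numpy as np
import holoviews as hv
from holoviews.operation.datashader import datashade, inspect_points

hv.extension('bokeh')

# Hypothetical point data standing in for the PR's example dataset.
points = hv.Points(np.random.randn(100_000, 2))
shaded = datashade(points)

# inspect_points dynamically overlays the sample(s) under the cursor.
highlight = inspect_points(shaded)

shaded * highlight.opts(color='red', marker='circle', size=10)
```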
Getting tap behavior is as easy as adding streams=[hv.streams.Tap] to the operation.

TODO