LLM offsets logic consolidate w/ checks and test case fix #1422
Commits on Oct 26, 2024
Fix for automatic GitHub workflow reruns (pytorch#1419)
Summary: A slight bug in the retry GitHub workflow causes failing workflows to rerun indefinitely. This _should_ fix it.
Differential Revision: D65008676
Commit: fd688bb

Add __call__ to TokenizerLike (pytorch#1418)
Summary: Add `__call__` to TokenizerLike for transformers compatibility.
Differential Revision: D64998805
Commit: eb3eca2

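A minimal sketch of the kind of interface change pytorch#1418 describes: adding `__call__` to a tokenizer protocol so tokenizer objects can be called directly, the way transformers tokenizers are. The protocol below is illustrative, not Captum's actual TokenizerLike definition.

```python
# Illustrative protocol only; Captum's actual TokenizerLike may differ.
from typing import Any, Dict, List, Protocol


class TokenizerLikeSketch(Protocol):
    def encode(self, text: str) -> List[int]:
        """Convert text to token ids."""
        ...

    def decode(self, token_ids: List[int]) -> str:
        """Convert token ids back to text."""
        ...

    def __call__(self, text: str, **kwargs: Any) -> Dict[str, Any]:
        """Allow tokenizer(text), mirroring how transformers tokenizers are used."""
        ...
```
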
Improve tokenizer pretty-print logic + __call__ method (pytorch#1417)
Summary: Use the __call__ method of tokenizers, which returns a BatchEncoding with offsets. This allows us to grab text from the fully decoded string and not make assumptions about how many tokens correspond to a single string.
Differential Revision: D64998804
Commit: b4fe485

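A hedged example of the approach pytorch#1417 describes, assuming a Hugging Face fast tokenizer (only fast tokenizers support offset mappings); the model name and variable names are illustrative.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # illustrative model choice
text = "a house in the mountains"

# Calling the tokenizer returns a BatchEncoding; offset_mapping holds
# (start, end) character spans into the original string for each token.
enc = tokenizer(text, return_offsets_mapping=True)
for token_id, (start, end) in zip(enc["input_ids"], enc["offset_mapping"]):
    # Slice the original string instead of assuming one token == one word.
    print(token_id, repr(text[start:end]))
```
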
Fix mypy issue in visualization.py (pytorch#1416)
Summary: visualize_image_attr_multiple can return a List[Axes]; this adds proper annotations to satisfy mypy.
Why casting is necessary: numpy/numpy#24738 and https://github.com/matplotlib/matplotlib/blob/v3.9.2/lib/matplotlib/pyplot.py#L1583C41-L1584C1
Differential Revision: D64998799
Commit: aea894e

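For context on the cast mentioned in pytorch#1416: plt.subplots can return either a single Axes or a numpy array of Axes, which mypy cannot narrow to List[Axes] on its own. A hedged sketch of this kind of cast, not the actual code in visualization.py:

```python
# Hedged sketch; not the actual code in captum's visualization.py.
from typing import List, cast

import matplotlib.pyplot as plt
import numpy as np
from matplotlib.axes import Axes

fig, axes = plt.subplots(1, 3)
# plt.subplots returns a numpy array of Axes here; mypy cannot infer List[Axes],
# so we flatten to a list and cast explicitly to satisfy the checker.
axes_list: List[Axes] = cast(List[Axes], list(np.atleast_1d(axes).ravel()))
```
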
Remove mypy note from infidelity.py (pytorch#1415)
Summary: Adds enough typing to get rid of `captum/metrics/_core/infidelity.py:498: note: By default the bodies of untyped functions are not checked, consider using --check-untyped-defs [annotation-unchecked]`.
Differential Revision: D64998800
Commit: 7b944f9

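The note quoted in pytorch#1415 is mypy reporting that it skipped an untyped function body. A generic illustration of the fix, annotating the nested function so its body is actually checked; this is not the real code from infidelity.py.

```python
# Generic illustration; not the actual function in captum/metrics/_core/infidelity.py.
from torch import Tensor


def infidelity_like_metric(inputs: Tensor) -> Tensor:
    # Before: "def _scale(x):" would emit the [annotation-unchecked] note,
    # because mypy skips the bodies of untyped functions by default.
    def _scale(x: Tensor) -> Tensor:
        return x * 2

    return _scale(inputs)
```
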
Fix remaining pyre errors in infidelity.py (pytorch#1414)
Summary: Fix pyre/mypy errors in infidelity.py. Introduce new BaselineTupleType.
Differential Revision: D64998803
Commit: 59b7e81

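A hedged guess at what a dedicated baseline-tuple alias could look like, based only on the name; the real BaselineTupleType in Captum may be defined differently.

```python
# Assumption based on the name alone; Captum's real definition may differ.
from typing import Tuple, Union

from torch import Tensor

BaselineTupleType = Union[None, Tuple[Union[Tensor, int, float], ...]]
```
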
Fix kwargs annotation in feature_ablation.py (pytorch#1421)
Summary: Fix incorrect **kwargs annotation.
Differential Revision: D65001879
Commit: a92c1b8

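The usual pitfall behind an incorrect **kwargs annotation is annotating the whole dict instead of each value: in a signature, the annotation on **kwargs describes individual keyword values. A generic illustration, not the actual feature_ablation.py signature:

```python
# Generic illustration; not the actual signature in feature_ablation.py.
from typing import Any, Dict


def attribute_wrong(**kwargs: Dict[str, Any]) -> None:
    # Wrong: this claims every keyword *value* is itself a Dict[str, Any].
    ...


def attribute_right(**kwargs: Any) -> None:
    # Right: each keyword value may be anything; kwargs as a whole is Dict[str, Any].
    ...
```
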
Consolidate LLM attr logic (pytorch#1420)
Summary: Add base class BaseLLMAttribution to consolidate repeated logic between perturbation- and gradient-based LLM attr classes.
Differential Revision: D65008854
Commit: 5b60545

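A sketch of the consolidation pattern pytorch#1420 describes: a shared base class holding the model/tokenizer plumbing that perturbation- and gradient-based LLM attribution classes would otherwise duplicate. Class, attribute, and method names here are assumptions, not Captum's actual API.

```python
# Names and structure are assumptions; not Captum's actual BaseLLMAttribution.
from typing import Any

import torch
from torch import nn


class BaseLLMAttributionSketch:
    """Shared plumbing for perturbation- and gradient-based LLM attribution."""

    def __init__(self, model: nn.Module, tokenizer: Any) -> None:
        self.model = model
        self.tokenizer = tokenizer
        # Resolve the model device once instead of repeating it in each subclass.
        self.device = next(model.parameters()).device

    def _encode(self, text: str) -> torch.Tensor:
        # Encode prompt text to a batch of token ids on the model's device.
        return torch.tensor([self.tokenizer.encode(text)], device=self.device)
```
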
LLM offsets logic consolidate w/ checks and test case fix (pytorch#1422)
Summary: Consolidate offsets logic with extra checks into one function. It may later be used to group data in gradient LLM attribution. A test case was fixed as a result of the checks.
Differential Revision: D65010820
Commit: e232eb1

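A hedged sketch of what consolidating offsets logic with extra checks into one function could look like; the function name and the specific checks are illustrative, not the code in this PR.

```python
# Function name and checks are illustrative; not the actual code in this PR.
from typing import List, Tuple


def validate_token_offsets(
    text: str, offsets: List[Tuple[int, int]]
) -> List[Tuple[int, int]]:
    """Check that per-token (start, end) character offsets are sane for `text`."""
    prev_end = 0
    for start, end in offsets:
        if not (0 <= start <= end <= len(text)):
            raise ValueError(f"offset {(start, end)} is out of range for the text")
        if start < prev_end:
            raise ValueError("offsets must be ordered and non-overlapping")
        prev_end = end
    return offsets
```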