Is it possible to exclude fields by default? #303
The example you wrote looks possible by overriding the […] and overriding […] (or just passing the […]).
Would it be possible to extend the concept more generically? We could add an extra parameter to the serializer, named for example "lazy_load", so that fields marked this way are only loaded when explicitly requested in the query parameters, allowing on-demand data retrieval. I love this project and we are currently using it in our production software, but I believe this feature would sometimes help minimize data overfetching on our endpoints while reusing existing serializers. The example at the beginning of the thread illustrates the requirement perfectly. Keeping in mind that we are working with a REST architectural style, a GET for users should ideally return only the information directly associated with the user; if we eventually need more information, we can explicitly ask for it with this excellent tool.
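A rough sketch of what the proposal could look like, purely illustrative: `LazyLoadMixin` and `Meta.lazy_load` are made-up names (not an existing API of this project), and the substring match on the raw query string is a deliberate simplification:

```python
# Sketch of the proposed behavior: fields listed in Meta.lazy_load are
# excluded unless the ?query= parameter names them explicitly.
from rest_framework import serializers

from myapp.models import User  # hypothetical model with a `books` relation


class LazyLoadMixin:
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Assumes a DRF request was passed in the serializer context.
        request = self.context.get("request")
        raw_query = request.query_params.get("query", "") if request else ""
        for name in getattr(self.Meta, "lazy_load", ()):
            # Naive substring check for illustration only; a real
            # implementation would parse the query properly.
            if name not in raw_query:
                self.fields.pop(name, None)


class UserSerializer(LazyLoadMixin, serializers.ModelSerializer):
    class Meta:
        model = User
        fields = ["username", "books"]
        lazy_load = ["books"]  # only serialized when explicitly queried
```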
Is it possible to exclude fields by default? I have a serializer with a field that carries a lot of data. This data is needed in some specific cases, but not always.
Imagine a user with 500 books. In my logic I normally don't need information about those 500 books, but I do need information about the user (this is not a real example).
I could exclude it using the query
GET /user/1?query={username, -books}
but it forces me to put it everywhere the endpoint is consumed. The idea would be something like the two cases below: a default response without the books field, and a response that includes it only when asked for explicitly.
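The example payloads that presumably followed were lost in this copy; a plausible reconstruction of the intent, with made-up field values:

```
# Default: heavy field excluded
GET /user/1
{"username": "alice"}

# With books field: explicitly requested
GET /user/1?query={username, books}
{"username": "alice", "books": [{"id": 1, "title": "Dune"}, ...]}
```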
Thank you for everything!