Because WriteBulkBytes builds the bulk action line by string concatenation rather than actual JSON encoding, strings containing special characters (quotes, backslashes, control characters) in the non-data fields produce invalid JSON. Some of these values should be rejected (or handled gracefully, see elastic/elasticsearch#9059) by Elasticsearch anyway, but the request never gets that far because the client sends invalid JSON.
This can be fixed by JSON-marshaling the individual strings, or the entire bulk operation command along with the data payload (which is what I ended up doing for now). I assume, though, that this was avoided in the first place because of the performance penalty of marshaling?
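For illustration, here's a minimal sketch of the fix in Go. `buildBulkCommand` is a hypothetical helper, not the library's actual WriteBulkBytes signature; it just shows how marshaling the whole bulk action header with encoding/json escapes special characters correctly:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
)

// buildBulkCommand marshals the entire bulk action header instead of
// concatenating the index/type/id strings into a JSON template by hand,
// so quotes, backslashes, and control characters are escaped properly.
func buildBulkCommand(op, index, typ, id string, data []byte) ([]byte, error) {
	header := map[string]map[string]string{
		op: {"_index": index, "_type": typ, "_id": id},
	}
	var buf bytes.Buffer
	enc := json.NewEncoder(&buf)
	if err := enc.Encode(header); err != nil { // Encode appends the newline the bulk format requires
		return nil, err
	}
	buf.Write(data) // data is assumed to be an already-marshaled JSON document
	buf.WriteByte('\n')
	return buf.Bytes(), nil
}

func main() {
	// An _id containing a quote would break naive string concatenation;
	// json.Marshal escapes it instead of emitting invalid JSON.
	doc := []byte(`{"field":"value"}`)
	cmd, err := buildBulkCommand("index", "myindex", "mytype", `weird"id`, doc)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(cmd))
}
```

Marshaling only the individual field strings would also work and touches less of the existing code path, but marshaling the whole header is simpler and harder to get wrong.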