I've been using omnisharp-roslyn with lsp-mode on a pretty big project.

My editor will frequently lock up for a very long time (anywhere from ~10 seconds to several minutes) when I make changes to a source file, with high CPU usage in the language server.
I'm going to walk through my investigation of the problem. I'm not super familiar with all the languages and software involved, so I may have missed out on some useful debugging tools.
The first thing I did was try using dotnet-trace to track down what the language server was doing. There was one clearly active thread, but unfortunately the output seems to have a limit on call-stack depth (every stack on this thread has exactly the same depth), so the bottom of the stack is missing. Neither speedscope nor the Chromium trace viewer could make sense of these incoherent stacks, so I gave up on this approach.
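For reference, the commands involved were roughly these (from memory, so the exact flags may differ between dotnet-trace versions):

```sh
# Attach to the running language server and collect a CPU trace
# (<pid> is the OmniSharp process id; Ctrl+C stops the collection).
dotnet-trace collect --process-id <pid> -o trace.nettrace

# Convert the trace for the two viewers I tried.
dotnet-trace convert trace.nettrace --format Speedscope   # for https://www.speedscope.app
dotnet-trace convert trace.nettrace --format Chromium     # loads in chrome://tracing
```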
Next I decided to capture the input sent to the language server so that I could replay it. I started the language server and typed a bunch of characters into a large file, which reproduced the lock-up behaviour and the high CPU usage. To replay the capture I had to split it into two chunks and delay the second chunk until after initialisation; piping that into the language server also reproduced the high CPU usage.
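Concretely, the capture was a small wrapper script that the editor launches in place of the real server, and the replay just pipes the recorded bytes back in with a pause between the two chunks. The paths, the -lsp flag and the sleep below are placeholders rather than the exact command lsp-mode uses:

```sh
#!/bin/sh
# capture-wrapper.sh: record everything the editor sends before forwarding it
# to the real server.
tee /tmp/lsp-input.bin | /path/to/OmniSharp -lsp "$@"
```

```sh
# Replay: feed the initialisation chunk, give the server time to settle,
# then feed the chunk containing the edit/completion traffic.
{ cat /tmp/chunk1.bin; sleep 15; cat /tmp/chunk2.bin; } | /path/to/OmniSharp -lsp > /dev/null
```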
The part where it locked up seemed to be a bunch of consecutive completion operations that were being started and cancelled.
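Schematically the captured traffic looked something like this (illustrative only; the ids and params are made up, and the Content-Length headers between messages are omitted):

```
{"jsonrpc":"2.0","id":41,"method":"textDocument/completion","params":{ ... }}
{"jsonrpc":"2.0","method":"$/cancelRequest","params":{"id":41}}
{"jsonrpc":"2.0","id":42,"method":"textDocument/completion","params":{ ... }}
{"jsonrpc":"2.0","method":"$/cancelRequest","params":{"id":42}}
{"jsonrpc":"2.0","id":43,"method":"textDocument/completion","params":{ ... }}
{"jsonrpc":"2.0","method":"$/cancelRequest","params":{"id":43}}
```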
I started adding stderr prints to omnisharp-roslyn and csharp-language-server-protocol to try to figure out how it reads and responds to requests.
I found that in InputHandler the request dispatch seemed to take a long time for the completion operations:
var requestHandle = _requestInvoker.InvokeRequest(descriptor, item.Request);
This was blocking the thread from reading ahead in stdin, including (critically) the cancellation requests. It was acting as if the request was synchronous.
I found OmniSharpCompletionHandler, which asynchronously handles the request, and started adding prints to figure out which thread things were running on. I found that at least the bulk of the completion operation was running on the same thread as InputHandler, which explained the blocking.
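The prints were nothing fancy, just something like this dropped into the code paths of interest (the message text is arbitrary; stderr is safe to write to because the LSP transport only uses stdin/stdout):

```csharp
// Crude tracing to see which thread a given piece of work runs on.
System.Console.Error.WriteLine(
    $"[{System.DateTime.UtcNow:HH:mm:ss.fff}] OmniSharpCompletionHandler.Handle " +
    $"on thread {System.Threading.Thread.CurrentThread.ManagedThreadId}");
```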
I then found the place where the async operation gets run under Rx, which is in DefaultRequestInvoker:
Observable.FromAsync(
    async ct => {
        [...]
I tried passing various scheduler instances to FromAsync, but I ran into a problem where the operations would no longer complete. It should be reproducible simply by passing a scheduler (e.g. ThreadPoolScheduler.Instance) to the FromAsync call above.
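As a standalone sketch of the shape I mean (reconstructed for illustration; the timings and strings are made up, and I'm not claiming this exact program hangs):

```csharp
using System;
using System.Reactive.Concurrency;
using System.Reactive.Linq;
using System.Threading.Tasks;

class FromAsyncRepro
{
    static async Task Main()
    {
        // Same shape as DefaultRequestInvoker: Amb races a timeout against the real
        // work, and the FromAsync leg is given an explicit scheduler.
        var result = await Observable.Amb(
            Observable.Timer(TimeSpan.FromSeconds(30)).Select(_ => "timed out"),
            Observable.FromAsync(
                async ct =>
                {
                    await Task.Delay(100, ct);
                    return "completed";
                },
                ThreadPoolScheduler.Instance
            )
        );

        Console.WriteLine(result);
    }
}
```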
I still don't really understand this. It seemed to have something to do with Amb and how the async operations are started. For some reason, rearranging the Amb (to have FromAsync first) fixed it:
diff --git a/src/JsonRpc/DefaultRequestInvoker.cs b/src/JsonRpc/DefaultRequestInvoker.cs
index f4657dc9..62255d34 100644
--- a/src/JsonRpc/DefaultRequestInvoker.cs
+++ b/src/JsonRpc/DefaultRequestInvoker.cs
@@ -79,19 +79,6 @@ public override void Dispose()
observer => {
// ITS A RACE!
var sub = Observable.Amb(
- contentModifiedToken.Select(
- _ => {
- _logger.LogTrace(
- "Request {Id} was abandoned due to content be modified", request.Id
- );
- return new ErrorResponse(
- new ContentModified(request.Id, request.Method)
- );
- }
- ),
- Observable.Timer(_options.RequestTimeout, scheduler).Select(
- _ => new ErrorResponse(new RequestCancelled(request.Id, request.Method))
- ),
Observable.FromAsync(
async ct => {
using var timer = _logger.TimeDebug(
@@ -133,6 +120,22 @@ public override void Dispose()
);
return new InternalError(request.Id, request.Method, e.ToString());
}
+ },
+ ThreadPoolScheduler.Instance
+ ),
+ contentModifiedToken.Select(
+ _ => {
+ _logger.LogTrace(
+ "Request {Id} was abandoned due to content be modified", request.Id
+ );
+ return new ErrorResponse(
+ new ContentModified(request.Id, request.Method)
+ );
+ }
+ ),
+ Observable.Timer(_options.RequestTimeout, scheduler).Select(
+ _ => {
+ return new ErrorResponse(new RequestCancelled(request.Id, request.Method));
}
)
)
With these changes, there was much less blocking in InputHandler, so the editor experience was better.
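For what it's worth, part of the original blocking now makes sense to me: Amb subscribes to its sources in argument order, and FromAsync without a scheduler invokes its async delegate synchronously on whatever thread subscribes, up to the delegate's first await. A tiny standalone demo of just that behaviour (illustrative, not the library code):

```csharp
using System;
using System.Reactive.Linq;
using System.Threading;
using System.Threading.Tasks;

class AmbDemo
{
    static async Task Main()
    {
        Console.WriteLine($"subscribing thread: {Thread.CurrentThread.ManagedThreadId}");

        var winner = await Observable.Amb(
            Observable.FromAsync(async ct =>
            {
                // With no scheduler, this runs synchronously on the subscribing thread.
                Console.WriteLine($"first leg starts on thread {Thread.CurrentThread.ManagedThreadId}");
                await Task.Delay(50, ct);
                return "first";
            }),
            Observable.FromAsync(async ct =>
            {
                // Amb subscribes in argument order, so this only starts after the
                // first leg has run up to its first await.
                Console.WriteLine($"second leg starts on thread {Thread.CurrentThread.ManagedThreadId}");
                await Task.Delay(100, ct);
                return "second";
            })
        );

        Console.WriteLine($"winner: {winner}");
    }
}
```

So if the expensive part of a handler happens before its first real await, it runs on whichever thread did the subscribing, which in this case appears to be the InputHandler thread.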
I'd still occasionally get operations running on the same thread as InputHandler, so I tried adding this hack:
diff --git a/src/JsonRpc/DefaultRequestInvoker.cs b/src/JsonRpc/DefaultRequestInvoker.cs
index f825b4c7..acdf3851 100644
--- a/src/JsonRpc/DefaultRequestInvoker.cs
+++ b/src/JsonRpc/DefaultRequestInvoker.cs
@@ -105,6 +105,7 @@ public override void Dispose()
var sub = Observable.Amb(
Observable.FromAsync(
async ct => {
+ await Task.Yield();
using var timer = _logger.TimeDebug(
"Processing request {Method} {ResponseId}", request.Method,
This does seem to help, but it feels wrong, so I thought it was time for me to check in and see if anyone has any thoughts on all of this.
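For context, my understanding of why the Task.Yield hack helps: in a console app there is no SynchronizationContext, so await Task.Yield() queues the rest of the method to the thread pool instead of letting it keep running synchronously on the caller's thread. A minimal standalone illustration (not the library code):

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class YieldDemo
{
    static void Main()
    {
        Console.WriteLine($"caller thread: {Thread.CurrentThread.ManagedThreadId}");

        // Everything in HandleAsync before the Task.Yield runs synchronously here,
        // on the caller thread.
        var task = HandleAsync();
        task.Wait();
    }

    static async Task HandleAsync()
    {
        Console.WriteLine($"before Task.Yield: thread {Thread.CurrentThread.ManagedThreadId}");

        await Task.Yield();

        // With no SynchronizationContext, this continuation runs on a thread-pool
        // thread, so expensive work from here on no longer blocks the caller.
        Console.WriteLine($"after Task.Yield: thread {Thread.CurrentThread.ManagedThreadId}");
    }
}
```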
P.S. I did also solve some problems with cancellation of certain types of requests (e.g. completion), but that's a subject for an omnisharp-roslyn issue.