Thanks for creating the thread.
Exactly how VS runs Roslyn rules, and which processes it uses, isn’t documented and has changed over time, although it does try to be smart about it: if you’re editing a file it will wait until you’ve stopped typing before starting to run analysis on the changes; if you start typing again it will cancel the analysis (since any results might have been invalidated by the new changes). This does mean some CPU will be used in the background, but I haven’t observed long periods of high CPU usage in that process before.
As users we don’t have a lot of control over how or when VS performs live code analysis either. Turning off full solution analysis will reduce the load on the machine, but then you’ll only see issues in open code files.
If you are using the experimental out-of-process analysis feature, you might want to turn that off too. Both of those settings are available under Tools > Options > Text Editor > C# > Advanced.
Full solution analysis used to be on by default in previous versions of VS but now defaults to off because of the performance impact.
The only other thing you can do is reduce the number of rules being run by creating a ruleset that disables some of them. Most individual Roslyn analyser rules don’t have much of an impact on performance. However, we have a number of rules that use our symbolic execution engine to detect more complex issues. These rules perform data-flow analysis, so they are more expensive in terms of processing time and memory, but I still wouldn’t expect them to cause long periods of high CPU, particularly if you have full solution analysis turned off.
FYI our symbolic execution rules are S4158, S3966, S3900, S3655, S2259, S1944, S2583 and S2589. If you do try disabling these rules and find it makes a difference on your machine, please let us know.
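As a sketch of what that ruleset could look like, something along these lines should work (the file name and the RuleSet Name are just examples; the AnalyzerId/RuleNamespace shown are the ones used by the SonarAnalyzer.CSharp NuGet package):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Example: DisableSymbolicExecution.ruleset -->
<RuleSet Name="Sonar rules minus symbolic execution" ToolsVersion="15.0">
  <Rules AnalyzerId="SonarAnalyzer.CSharp" RuleNamespace="SonarAnalyzer.CSharp">
    <!-- Turn off the data-flow/symbolic execution rules listed above -->
    <Rule Id="S4158" Action="None" />
    <Rule Id="S3966" Action="None" />
    <Rule Id="S3900" Action="None" />
    <Rule Id="S3655" Action="None" />
    <Rule Id="S2259" Action="None" />
    <Rule Id="S1944" Action="None" />
    <Rule Id="S2583" Action="None" />
    <Rule Id="S2589" Action="None" />
  </Rules>
</RuleSet>
```

You’d then point the project at it via the project properties (Code Analysis page) or a `<CodeAnalysisRuleSet>` property in the .csproj.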
Upgrading to VS2019 might well make a difference: the VS platform team are actively trying to improve the performance and responsiveness of VS so it’s possible that the way analysers are run has been improved.