We have a guide (The SonarSource guide for investigating the performance of .NET analysis) which can help you identify long-running rules.
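To give you an idea of what that investigation looks like in practice, here is a minimal sketch (the solution name is a placeholder, and the guide itself remains the authoritative reference): you build with a binary log and ask the compiler to report per-analyzer execution times.

```
# Sketch: produce a binary log with per-analyzer timings.
# "MySolution.sln" is a placeholder for your own solution.
dotnet build MySolution.sln -bl -p:ReportAnalyzer=true

# Open the resulting msbuild.binlog in the MSBuild Structured Log Viewer
# and look at the analyzer execution times to spot the long-running rules.
```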
Please note that some of our advanced rules find tricky bugs and injection vulnerabilities, so even if they consume a lot of resources, they bring a lot of value. However, we also have some maintainability rules that can consume a lot of resources on a particular codebase and that you can disable (see the sketch below). Please feel free to share your findings on this forum and ask for guidance on how to improve performance without losing too much value (the most performance-efficient option is to run no static analysis at all, but then you wouldn't benefit from any of the help it brings).
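For example, once you have identified a costly maintainability rule, you can usually turn it off for the whole repository with a single .editorconfig entry, since our rules run as regular Roslyn analyzers. A small sketch (S0000 is a placeholder for the rule ID you actually identified):

```
# .editorconfig at the repository root
# Hypothetical rule ID: replace S0000 with the rule you want to disable.
[*.cs]
dotnet_diagnostic.S0000.severity = none
```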
Roslyn (the compiler framework for .NET) runs code analyzers in parallel, so it can consume a lot of resources. Also, to demystify our analysis a bit: to find injection vulnerabilities, our taint analysis engine loads the control flow graph representation of the entire project into memory, so it can consume quite a lot of memory.
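If peak CPU or memory during the build is the main concern, one general mitigation you can try is limiting MSBuild's project-level parallelism. This is a standard MSBuild option rather than anything specific to our analyzers, and it will make the build slower, so treat it as a trade-off to experiment with:

```
# Build projects one at a time instead of in parallel (placeholder solution name).
dotnet build MySolution.sln -m:1
```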