C++ Sonar Analysis fails with "LLVM ERROR: out of memory"

The C++ project Methane Kit runs Sonar Scanner analysis as part of a continuous integration build in Azure Pipelines for every commit. Starting from commit 3391f5b in the develop branch, the SonarCloudAnalyze task has started failing on Windows with this error:

##[error]LLVM ERROR: out of memory
LLVM ERROR: out of memory
##[error]ERROR: Exception in thread pool-4-thread-3
com.sonar.cpp.analyzer.Analyzer$AnalyzerException: Exit code != 0: D:\a\1\s\Modules\Graphics\Core\Sources\Methane\Graphics\DirectX12\ShaderDX.cpp
	at com.sonar.cpp.analyzer.Subprocess.execute(Subprocess.java:81)
	at com.sonar.cpp.analyzer.Subprocess.execute(Subprocess.java:78)
	at com.sonar.cpp.plugin.CFamilySensor.lambda$process$8(CFamilySensor.java:643)
	at com.sonar.cpp.analyzer.AnalysisExecutor.lambda$submit$0(AnalysisExecutor.java:59)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
	at java.base/java.lang.Thread.run(Thread.java:834)

macOS and Linux analysis continued working normally, but the Windows analysis now fails on the same source file, ShaderDX.cpp. I enabled verbose output and saved the generated sonar-cfamily-reproducer.zip in the build artifacts to help with your investigation. I also double-checked that Sonar analysis works fine on the previous commit 206b822. Additionally, I tried running the Sonar scanner locally on my desktop and it worked just fine on the latest code (not surprising with 32 GB of memory on board). I should add that I have no idea what code changes could have caused this failure.

Hi @egorodet,

2021-01-17T16:25:25.9299273Z 16:25:25.646 INFO: Available processors: 2
2021-01-17T16:25:25.9299991Z 16:25:25.646 INFO: Using 4 threads for analysis according to value of "sonar.cfamily.threads" property.

In sonar-analysis-verbose-log I can see that you set 4 threads while only 2 processors are available. That means memory is budgeted for 2 processors, not for 4. Could you give it a try with the number of threads set to 2? There is no gain in setting the number of threads higher than the number of processors available.

I’ve tried changing sonar.cfamily.threads from 4 to 2, but unfortunately it did not fix the “out of memory” error; see the new build log. The reason it was set to 4 is that Azure workers have either 2 or 4 processors available, and there is no way to detect this automatically.

Hi @egorodet,

What about trying with a single thread?

Did you evaluate the %NUMBER_OF_PROCESSORS% environment variable?

I tried single-threaded analysis; it did not help either.

OK, it seems I found the code that was the root cause of the scanner error and was able to work around the problem by rewriting it. It was related to changes in Exceptions.hpp:

1st version of the code, which results in the scanner error “out of memory”:

template<typename T, typename RawType = typename std::decay<T>::type>
class UnexpectedArgumentException : public ArgumentExceptionBase<std::invalid_argument>
{
public:
    UnexpectedArgumentException(const std::string& function_name, const std::string& variable_name, T value, const std::string& description = "")
        : ArgumentExceptionBaseType(function_name, variable_name,
                                    [&value]
                                    {
                                        if constexpr (std::is_enum_v<RawType>)
                                            return fmt::format("enum {} value {}({}) is unexpected", magic_enum::enum_type_name<RawType>(), magic_enum::enum_name(value), value);
                                        else
                                            return fmt::format("{} value {} is unexpected", typeid(RawType).name(), value);
                                    }(),
                                    description)
        , m_value(value)
    { }

    [[nodiscard]] RawType GetValue() const noexcept { return m_value; }

private:
    RawType m_value;
};

2nd version of the code, which fixes the problem and is scanned without errors:

template<typename T, typename RawType = typename std::decay<T>::type>
class UnexpectedArgumentException : public ArgumentExceptionBase<std::invalid_argument>
{
public:
    template<typename ValueType = RawType>
    UnexpectedArgumentException(const std::string& function_name, const std::string& variable_name, ValueType&& value, const std::string& description = "")
        : ArgumentExceptionBaseType(function_name, variable_name, GetMessage(value), description)
        , m_value(std::forward<ValueType>(value))
    { }

    [[nodiscard]] RawType GetValue() const noexcept { return m_value; }

private:
    template<typename ValueType = RawType>
    static std::string GetMessage(ValueType&& value)
    {
        if constexpr (std::is_enum_v<ValueType>)
            return fmt::format("enum {} value {}({}) is unexpected", magic_enum::enum_type_name<ValueType>(), magic_enum::enum_name(std::forward<ValueType>(value)), std::forward<ValueType>(value));
        else
            return fmt::format("{} value {} is unexpected", typeid(ValueType).name(), std::forward<ValueType>(value));
    }

    RawType m_value;
};

So the issue was likely caused by that lambda in the constructor of the UnexpectedArgumentException class. While the problem is fixed for me, I still think it is important to fix it in the scanner tool so that it won’t reappear for me or other users. Please create a bug report with this information; I hope it will help the development team fix the tool.
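For completeness, the suspect pattern can be reduced to a small standalone sketch (my own minimal reduction, with plain std::string in place of fmt and magic_enum, so the class name and messages here are illustrative only): an immediately-invoked lambda containing `if constexpr` inside the member-initializer list of a class template.

```cpp
#include <string>
#include <type_traits>
#include <typeinfo>

// Same shape as the failing constructor: an immediately-invoked lambda
// with `if constexpr` inside the member-initializer list of a class template.
template<typename T, typename RawType = typename std::decay<T>::type>
class UnexpectedValue
{
public:
    explicit UnexpectedValue(T value)
        : m_message([&value]
                    {
                        if constexpr (std::is_enum_v<RawType>)
                            return std::string("enum value is unexpected");
                        else
                            return std::string(typeid(RawType).name()) + " value is unexpected";
                    }())
        , m_value(value)
    { }

    [[nodiscard]] const std::string& GetMessage() const noexcept { return m_message; }
    [[nodiscard]] RawType GetValue() const noexcept { return m_value; }

private:
    std::string m_message;
    RawType     m_value;
};
```

The fix in the 2nd version replaces this lambda with a static member function template, which normal compilers handle identically but which apparently avoided the analyzer's memory blow-up.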

Hello @egorodet,

I tried to reproduce your situation on my machine, using the reproducer file that you provided. Unfortunately, even without your change in the file, I could not reproduce it, and the peak memory usage of the analysis process was around 300 KB. With your proposed change, it went down to… 299 KB, so almost identical.

Do you have any idea of the peak memory usage when you run this locally on your machine? (You should monitor the processes named subprocess.exe; that is where the real work happens.)

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.