Hello and thanks for the great project.
I was a bit surprised to find out that an untracked file was uploaded to the public internet. This time no harm was done, but it could have been a file containing passwords or other sensitive information related to the project.
I was also a bit surprised to see that the project's submodules don't get analyzed. While this is the desired behavior, I'd have thought they would go through the same process as untracked files.
ALM used (GitHub, Bitbucket Cloud, Azure DevOps)
GitHub
CI system used (Bitbucket Cloud, Azure DevOps, Travis CI, Circle CI)
None
Scanner command used when applicable (private details masked)
I understand your concern, and we should probably clarify something in our documentation and in the onboarding process for new users/projects: SonarCloud is designed to be used as part of CI/CD processes. This means that we don't expect users to run analyses manually on their own, but to extend their existing CI/CD pipelines - which de facto fixes the problem of untracked files. In other words, we expect users to run SonarCloud analyses only on repositories that are freshly cloned by a CI tool/service. Does that clarify things?
I understand the reasoning, and I've already added the projects to a Travis pipeline, where this issue doesn't exist. However, I'd guess that almost everyone will run Sonar locally before going through the trouble of updating their CI pipeline, and it's a nasty surprise to discover that something unintended is public on the internet.
Returning an error if untracked files are present under the folders being scanned would be better behavior, in my humble opinion. As you said yourself, the expected use case is running the scanner on a clean repository, where this issue doesn't exist. I'm also having a hard time coming up with a use case where a user would actually want to upload their untracked files instead of adding them to source control first.
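In the meantime, anyone running the scanner locally can put a guard in front of it themselves. Below is a minimal sketch of the check I'm proposing: it lists untracked files with `git ls-files --others --exclude-standard` and refuses to invoke the scanner if any exist. The bare `sonar-scanner` invocation at the end is a placeholder; substitute whatever command and arguments your project actually uses.

```python
#!/usr/bin/env python3
"""Refuse to run sonar-scanner while untracked files exist in the repo."""
import subprocess
import sys

# List untracked files, honoring .gitignore rules.
untracked = subprocess.run(
    ["git", "ls-files", "--others", "--exclude-standard"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()

if untracked:
    print("Refusing to scan: these untracked files would be uploaded:",
          file=sys.stderr)
    for path in untracked:
        print(f"  {path}", file=sys.stderr)
    sys.exit(1)

# No untracked files; hand over to the scanner (placeholder invocation).
sys.exit(subprocess.call(["sonar-scanner"]))
```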