Troubleshooting the analysis
See also the Troubleshooting section on the corresponding Scanner page.
Viewing the analysis progress status
During analysis, data is requested from the server, the files provided to the analysis are analyzed, and the resulting data is sent back to the server at the end in the form of a report, which is then processed asynchronously on the server side.
Analysis reports are queued and processed sequentially, so for a brief period after your analysis log shows completion, the updated values may not yet be visible in your SonarQube project. However, you can tell what's going on: an icon is added to the project homepage, to the right of the project name. Mouse over it for more detail (and links, if you're logged in with the proper permissions).
The icon goes away once processing is complete; if the analysis report processing fails for some reason, the icon changes to indicate the failure.
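If you prefer to check the report processing status from the command line, you can query the background task Web API. The sketch below is a minimal example assuming a server at sonarqube.example.com, a project key of my_project_key, and a user token stored in the SONAR_TOKEN environment variable.

```bash
# List the pending, in-progress, and last executed background tasks for a project.
# The server URL, project key, and token variable are placeholders for your own values.
curl -u "$SONAR_TOKEN:" \
  "https://sonarqube.example.com/api/ce/component?component=my_project_key"
```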
Out of memory error
If your analysis fails with java.lang.OutOfMemoryError: GC overhead limit exceeded, your project is too large or too intricate for the scanner to analyze with the default memory allocation. To fix this, allocate a larger heap (using -Xmx<size>, for example -Xmx2048m) to the process running the analysis. Some CI engines provide an input for specifying the necessary values, for instance if you're using a Maven Build Step in a Jenkins job to run the analysis. Otherwise, use Java Options to set a higher value. Note that the details of setting Java Options are omitted here because they vary depending on the environment.
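For example, if you run the SonarScanner CLI, the heap is typically raised through the SONAR_SCANNER_OPTS environment variable (for a Maven-based analysis, MAVEN_OPTS applies to the Maven process instead). The values below are only illustrative; adjust them to your project size.

```bash
# Give the SonarScanner CLI a 2 GB heap (illustrative value).
export SONAR_SCANNER_OPTS="-Xmx2048m"

# For a Maven-based analysis, raise the heap of the Maven process instead.
export MAVEN_OPTS="-Xmx2048m"
```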
You can also add exclusions for the files and folders you don't need to analyze by limiting your Analysis scope; an example is shown below. Additionally, using a solid-state drive (SSD) or similar storage helps speed up the analysis, especially for small-file access.
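As a sketch, exclusions can be passed as an analysis parameter on the scanner command line; the glob patterns below are illustrative and should be adapted to your project layout.

```bash
# Exclude generated code, third-party libraries, and minified files from the analysis scope.
sonar-scanner \
  -Dsonar.exclusions="**/generated/**,**/vendor/**,**/*.min.js"
```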
PKIX path building failed
If your analysis fails with PKIX path building failed, your SonarQube server is configured with HTTPS and a self-signed SSL certificate (see Securing the server behind a proxy in Operating the server), but the certificate is not correctly configured in the scanner machine's JVM. This configuration is outside of SonarQube's scope: the server certificate is unknown and could not be validated with the provided truststore. To solve the issue, import the SonarQube server certificate into the Java truststore used by the scanner, as shown below. See Oracle's documentation for more information.
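A minimal sketch of the import, assuming the server certificate has been exported to sonarqube-server.crt and the scanner uses the default JVM truststore (the cacerts path shown applies to Java 9 and later, and changeit is the default truststore password):

```bash
# Import the SonarQube server certificate into the JVM truststore used by the scanner.
keytool -importcert \
  -file sonarqube-server.crt \
  -alias sonarqube \
  -keystore "$JAVA_HOME/lib/security/cacerts" \
  -storepass changeit
```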
Various error messages
The format of the analysis property sonar.token= is invalid
You may encounter this issue when SONAR_TOKEN is used as a secret in a calling workflow in GitHub Actions and the called (reusable) workflow cannot read it as a secret. In that case, make sure that the secret is passed down from the calling workflow (you may use the secrets: inherit keyword), as shown below. See the GitHub documentation for more information.
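A minimal sketch of a calling workflow, assuming the reusable workflow lives at .github/workflows/sonarqube-analysis.yml in the same repository:

```yaml
# Calling workflow: pass all secrets (including SONAR_TOKEN) down to the reusable workflow.
jobs:
  sonarqube:
    uses: ./.github/workflows/sonarqube-analysis.yml
    secrets: inherit
```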
The maximum number of open files was reached
On a Linux system, see Configuring the maximum number of open files and other limits in Pre-installation steps on Linux.
On a macOS system, see Configuring the maximum number of open files in Pre-installation steps on macOS.
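As a quick check before following those steps, you can inspect and temporarily raise the limit in the shell that launches the scanner; the value below is illustrative, and persistent limits must be configured as described in the pre-installation steps.

```bash
# Show the current limit on open file descriptors for this shell.
ulimit -n

# Temporarily raise it for the current session before starting the analysis (illustrative value).
ulimit -n 131072
```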
Error when analyzing files with non-ASCII characters in the name
If the Malformed input or input contains unmappable characters error is raised when analyzing files whose names contain non-ASCII characters, make sure that the environment variables LC_ALL and LANG are set to a UTF-8 locale before running the analysis, as shown below.
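For example, with a UTF-8 locale (en_US.UTF-8 is illustrative; any UTF-8 locale available on your system works):

```bash
# Set a UTF-8 locale so that non-ASCII file names are decoded correctly, then run the analysis.
export LC_ALL="en_US.UTF-8"
export LANG="en_US.UTF-8"
sonar-scanner
```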
No coverage data in pull request analysis report with AWS CodeBuild
Verify that the AWS CodeBuild LOCAL_SOURCE_CACHE feature is disabled.
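As a sketch, one way to do this is to update the build project's cache configuration with the AWS CLI; the project name below is a placeholder, and you may instead keep a LOCAL cache with only the modes you need.

```bash
# Disable caching (and therefore LOCAL_SOURCE_CACHE) for the CodeBuild project running the analysis.
aws codebuild update-project \
  --name my-codebuild-project \
  --cache type=NO_CACHE
```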
Failed to upload analysis report on cloud platform
If you encounter the "SonarQubeAnalyze fails at upload report - error POST 403 - Failed to upload: You’re not authorized" error and you're running SonarQube on a cloud platform, check that the cloud environment's web application firewall (WAF) configuration allows the upload. WAF rules can block SonarQube APIs, including report submission.