C/C++/Objective-C

C/C++/Objective-C analysis is officially registered as CWE Compatible.

Supported compilers

  • Any version of Clang, clang-cl, GCC, and Microsoft C/C++ compilers
  • Any version of the Intel compiler for Linux and macOS
  • ARM5 and ARM6 compilers
  • IAR compilers for ARM, Atmel AVR32, Atmel AVR, Renesas H8, Renesas RL78, Renesas RX, Renesas V850, Texas Instruments MSP430, and for 8051
  • QNX compilers
  • Texas Instruments compilers for ARM (armcl and tiarmclang), C2000, C6000, C7000, MSP430, and PRU
  • Wind River Diab and GCC compilers
  • Microchip MPLAB XC8, XC16, and XC32 Compilers
  • Compilers based wholly on GCC, including Linaro GCC

Note that statically linked compilers on Linux and macOS (for example, some versions of the Texas Instruments compilers on Linux) are not supported through Build Wrapper.

Supported language standards

Please check the C and C++ rows in the supported Languages table for an up-to-date list of supported versions.

Supported runtime environments

  • Microsoft Windows on x86-64
  • Linux on x86-64
  • macOS 10.14.3 and later, on x86-64 and Apple Silicon

Prerequisites

SonarScanner

Analysis of C/C++/Objective-C projects requires the SonarScanner CLI.

Build configuration

For a C/C++/Objective-C analysis to be accurate, the analyzer needs to understand how the code is meant to be compiled. Compilation options, like macro definitions and include directories, can have a huge impact on the generated code and consequently on the analysis results. You provide this information by supplying a Compilation Database to the analyzer.

A Compilation Database is a JSON file format introduced by the LLVM project. It contains the compile commands used to build a project.
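
For reference, a single entry in a compile_commands.json file typically looks like the following sketch (the paths, flags, and file names are purely illustrative):

[
  {
    "directory": "/home/user/myproject/build",
    "command": "/usr/bin/g++ -DNDEBUG -Iinclude -std=c++17 -c ../src/main.cpp -o main.o",
    "file": "../src/main.cpp"
  }
]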

There are two alternative ways to generate the compilation database:

  • SonarSource Build Wrapper
  • Third-Party tools

Choosing the right tool

The general recommendation is to use Build Wrapper unless you have a good reason not to.

Reasons to use Build Wrapper

  • Build Wrapper enforces running the build before the analysis, which ensures that the code is in good shape for analysis: the code is compilable, the configuration file is not outdated, and the generated source files are available during the analysis.
  • Your build relies on environment variables, which can only be captured by using Build Wrapper.
  • It is recommended and maintained by Sonar.

Reasons to use third party tools

  • Your build system is not supported by Build Wrapper.
  • You want to use a third-party tool that, unlike Build Wrapper, does not require a clean build to generate a compilation database.
  • You already generate and use a reliable Compilation Database in your CI pipeline.

Generating the compilation database

Using Build Wrapper

Analysis configuration example projects with Build Wrapper are available on GitHub.

Build Wrapper is a tool developed by SonarSource that generates a compilation database, capturing your build configuration, at build time. To run Build Wrapper, you should prepend your clean build command with the Build Wrapper executable.

When you wrap your build command with the Build Wrapper, it will run the given command and gather all the configuration required for a correct analysis of C/C++/Objective-C projects, such as macro definitions and include directories. Build Wrapper does not impact your build; it merely monitors it and writes what it learns into files in a directory you specify.

Download Build Wrapper directly from SonarCloud.

Unzip the downloaded Build Wrapper and add its directory to your PATH; this makes the following commands more convenient to run.
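
For example, on Linux the download and setup might look like the following sketch (the URL and archive name reflect SonarCloud's standard download location at the time of writing; adjust them to your platform):

# Download and unzip Build Wrapper for Linux x86-64
curl -sSLo build-wrapper-linux-x86.zip https://sonarcloud.io/static/cpp/build-wrapper-linux-x86.zip
unzip -o build-wrapper-linux-x86.zip
# The archive typically unpacks into a build-wrapper-linux-x86 directory; add it to the PATH for the current shell
export PATH="$PWD/build-wrapper-linux-x86:$PATH"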

Execute Build Wrapper as a prefix to your usual clean build command. A clean build command should always build the project from scratch. At the end of your build, a compile_commands.json file should be generated in the specified output directory. This file contains information about the compilation units that were built by your build command. 

Any file that doesn't end up in a compiled compilation unit will not be analyzed. As a consequence, source files that are not compiled and header files that are not included in any compiled source file will not be analyzed.

Note that Build Wrapper supports ccache, which can speed up clean builds by caching previous compilations and detecting when the same compilation is being done again.
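
For instance, with a Make-based build you might route compilations through ccache (a sketch, assuming ccache is installed and your Makefile honors the CC/CXX variables):

# Clean build under Build Wrapper, with compilations cached by ccache
build-wrapper-linux-x86-64 --out-dir build_wrapper_output_directory make CC="ccache gcc" CXX="ccache g++" clean all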

The examples below use make, xcodebuild, and MSBuild, but any build tool that performs a full build can be used: 

Linux
build-wrapper-linux-x86-64 --out-dir build_wrapper_output_directory make clean all
macOS
build-wrapper-macosx-x86 --out-dir build_wrapper_output_directory xcodebuild clean build
Windows
build-wrapper-win-x86-64.exe --out-dir build_wrapper_output_directory MSBuild.exe /t:Rebuild /nodeReuse:False

Important notes

  • Build Wrapper collects information about the build, including absolute file paths (source files, standard headers, libraries, etc.). Later on, SonarScanner uses this information and needs to access those paths. While this is straightforward when running these two steps on the same host, it is worth considering when using any sort of containerization. A consequence of this is that C/C++/Objective-C analysis is NOT supported by the SonarScanner CLI Docker image.
  • Build Wrapper generates three files in its output directory: build-wrapper-dump.json, compile_commands.json, and build-wrapper.log. All these files contain a dump of the environment, and this can be a security concern in some contexts.
  • Build Wrapper does not support statically linked compilers on Linux and macOS (for example, some versions of the Texas Instruments compilers on Linux).

Building with Bazel

Bazel recommends that you use the --batch parameter when running in a continuous build context. When using Build Wrapper, you are in such a context.

Also, you need to deactivate the sandbox mechanism of Bazel so that the compiled file paths can be retrieved after the compilation phase.

Here is an example of the Build Wrapper command with Bazel parameters on macOS:

build-wrapper-macosx-x86 --out-dir bw bazel \
  --batch \
  build \
  --spawn_strategy=local \
  --strategy=Genrule=local \
  --bazelrc=/dev/null \
  //main:hello-world

Building with MSBuild

Instead of starting new nodes when building your code, MSBuild can reuse previously launched build nodes. In that case, Build Wrapper will not be able to monitor files compiled on these nodes. Therefore, we advise disabling this feature by using the /nodeReuse:False command-line option.

Using third-party tools

Depending on the build system you are using, there are also some third-party options that can be used to generate a compilation database.

Some examples:

  • CMake by setting the option CMAKE_EXPORT_COMPILE_COMMANDS (see the sketch after this list)
  • Ninja through its compdb tool (ninja -t compdb)
  • Xcode through Clang’s -gen-cdb-fragment-path feature:
# Add the following "OTHER_CFLAGS" option to the xcodebuild command
xcodebuild clean build <additional args> OTHER_CFLAGS="\$(inherited) -gen-cdb-fragment-path \$(PROJECT_DIR)/CompilationDatabase"
# After the build, aggregate the fragments into "compile_commands.json"
cd CompilationDatabase && sed -e '1s/^/[\'$'\n''/' -e '$s/,$/\'$'\n'']/' *.json > ../compile_commands.json && cd ..
  • Clang using the -MJ option. Note that this generates one compilation database entry per input file. The entries can then be merged with something like sed -e '1s/^/[\'$'\n''/' -e '$s/,$/\'$'\n'']/' *.o.json > compile_commands.json
  • Open source wrappers like Bear and Bazel compile commands extractor
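
As a sketch, the CMake and Bear options listed above might be used as follows (directory names are illustrative; CMAKE_EXPORT_COMPILE_COMMANDS requires the Makefile or Ninja generators, and the Bear syntax shown is for Bear 3.x):

# CMake: export the compilation database while configuring, then build
cmake -DCMAKE_EXPORT_COMPILE_COMMANDS=ON -S . -B build
cmake --build build
# compile_commands.json is written into the "build" directory

# Bear: wrap the clean build command
bear -- make clean all
# compile_commands.json is written into the current directory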

Analysis configuration example projects that generate compilation databases using third-party tools are available on GitHub.

Important Notes

  • Make sure that the tool you are using generates the right compile commands. To do so, verify that the Compilation Database contains your actual build commands. You can also run one of the compilation commands and verify that it succeeds (see the sketch after this list).
  • The environment in which you execute the analysis should be the same as the build environment; the analyzer may need to access the build-related environment variables. For example, when using the Microsoft Visual C++ compiler, make sure to execute the analysis from the same Visual Studio Developer Command Prompt you use to build your project. The command prompt sets some environment variables, like INCLUDE, that need to be set during the analysis.
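
As referenced above, one way to spot-check the database on Linux is to replay its first entry with a tool such as jq (a sketch, assuming jq is installed and that the database uses the command field rather than arguments; this is only a sanity check, not part of the analysis):

# Run the first recorded compile command from its recorded directory
db="$PWD/compile_commands.json"
cd "$(jq -r '.[0].directory' "$db")"
eval "$(jq -r '.[0].command' "$db")"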

Executing the analysis

Once you have generated the compilation database file, you can proceed with the following steps to run the analysis:

  • Add the property sonar.cfamily.compile-commands in the sonar-project.properties file at the root of your project. Set it to the path of the Compilation Database file, relative to the project directory (compile_commands.json in these examples). Here's a sample sonar-project.properties configuration:
sonar.projectKey=myFirstProject
sonar.projectName=My First C++ Project
sonar.projectVersion=1.0
sonar.sources=src
sonar.cfamily.compile-commands=compile_commands.json
sonar.sourceEncoding=UTF-8
sonar.host.url=YourSonarCloudUrl
  • It is recommended to gather all your code trees in a subdirectory of your project to avoid analyzing irrelevant source files like third-party dependencies. You can specify this subdirectory by setting the property sonar.sources accordingly. In this example, we named it src.
  • Execute the SonarScanner CLI (sonar-scanner) from the root directory of your project: sonar-scanner
    For more SonarScanner CLI related options, consult SonarScanner CLI.
  • Follow the link provided at the end of the analysis to browse your project's quality metrics in the UI.

Analysis of several variants of your code

You can analyze different variants of the very same version of your code (different target architectures, different build options...) and see aggregated results on the server. All variants are analyzed in the same environment and during the same analysis. We do not support analyzing two variants that require the code to be built on two different machines.

You should define two properties:

  • sonar.cfamily.variants.names, which contains a comma-separated list of the names of all the variants that need to be analyzed.
  • sonar.cfamily.variants.dir, which contains a path to a folder with one subfolder per variant. Each subfolder should contain a compile_commands.json file.

These properties are mutually exclusive with sonar.cfamily.compile-commands.

The following example defines three variants:

sonar.projectKey=myFirstProject
sonar.projectName=My First C++ Project
sonar.projectVersion=1.0
sonar.sources=src
sonar.cfamily.variants.names=Linux x64,Linux ARM,Linux ARM64
sonar.cfamily.variants.dir=compilation-databases
sonar.sourceEncoding=UTF-8
sonar.host.url=YourSonarCloudUrl

And the project folder structure should look like this: 

Project root
|- compilation-databases
|  |- Linux ARM
|  |  |- compile_commands.json
|  |- Linux ARM64
|  |  |- compile_commands.json
|  |- Linux x64
|  |  |- compile_commands.json
...

When analyzing code for several variants, in most cases, data coming from all variants will be aggregated. For instance, in the previous configuration, you might see that a particular issue was present in Linux ARM and Linux ARM64, but not in Linux x64.

Language-specific properties

Discover and update the C/C++/Objective-C specific properties in Administration > General Settings > Languages > C/C++/Objective-C

Unsupported properties

sonar.tests

The sonar.tests property is used to identify the directories containing test source files. Identifying test files helps the analyzers tune their rules. For example, the analyzers can enable test-specific rules and disable rules that don't make sense in the context of testing.

The C/C++/Objective-C analyzer currently analyzes main and test source files the same way. Consequently, sonar.tests is not yet supported; the analyzer ignores it.

If you wish to analyze test source files, you should include them in the sonar.sources property.
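
For example, if your test sources live in a tests directory next to src (the directory names are illustrative), you could list both in sonar-project.properties:

sonar.sources=src,tests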

Analysis cache

The C/C++/Objective-C analyzer automatically caches the analysis results on the server. The cached results speed up subsequent analyses by re-analyzing only what has changed between two analyses.

  • For branch analysis, an analysis cache is stored on the server per branch.
  • For pull request analysis, the cache of the base branch is used and not persisted.
  • If no cache is found, the cache of the Main branch is used.

However, in some cases, the analyzer can be configured to store the analysis results on the filesystem. You should consider changing the cache storage to the filesystem when the server cache size becomes a concern or when you want to optimize the cache lifecycle based on your project workflow.

You can customize the analysis cache storage with the following properties:

  • sonar.cfamily.analysisCache.mode=server|fs
    • server sets the cache storage to the server; this is the default value.
    • fs sets the cache storage to the local filesystem.
  • sonar.cfamily.analysisCache.path=relative_or_absolute_path_to_cache_location, this property should only be declared when the mode is set to fs. Please note that each project should use a dedicated path. To fully benefit from this feature, you should configure your CI system to persist the cache path between runs.
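
For instance, to keep the cache on the local filesystem, your sonar-project.properties could contain something like the following sketch (the cache path is illustrative; use a dedicated path per project and have your CI persist it between runs):

sonar.cfamily.analysisCache.mode=fs
sonar.cfamily.analysisCache.path=.sonar-cfamily-cache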

Parallel code scan

By default, the analyzer tries to parallelize the analysis of compilation units; it spawns as many jobs as there are logical CPUs available on the machine.

If required, it is possible to customize the number of scheduled parallel jobs by configuring the property sonar.cfamily.threads=n at the scanner level, where n is an integer indicating the maximum number of parallel jobs.

You should consider setting the sonar.cfamily.threads property only when the automatic detection of the number of logical CPUs cannot detect the desired number. A typical example is when the analysis should not consume all the available computing resources to leave room for other tasks running in parallel on the same machine.

When setting the sonar.cfamily.threads property, use a value less than or equal to the number of logical CPUs available. Over-committing doesn't accelerate the analysis and can even slow it down.
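
For example, to cap the analysis at four parallel jobs (the value is illustrative), you could pass the property on the scanner command line instead of hard-coding it in sonar-project.properties:

sonar-scanner -Dsonar.cfamily.threads=4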

Targeted C++ Standard

The analyzer targets a specific version of the C++ standard, called the reporting standard, to tune the rules in the activated quality profile. The reporting standard is used to:

  • Suppress rules that cannot be applied. For example, rules that suggest using C++20 features while the code is compiled with C++17.
  • Adjust rule messages to suggest the proper fix according to the standard in use. For example, a rule that suggests using std::ranges::any_of with C++20 will be tuned to suggest std::any_of with older standards.

By default, the reporting standard version is the version that is used to compile the code. This is ideal for most projects. However, there are some edge cases where there is a need to use a reporting standard that is different from the compilation standard. For this reason, we provide the following scanner property to adjust the reporting standard:

sonar.cfamily.reportingCppStandardOverride=c++98|c++11|c++14|c++17|c++20

Using this property is only recommended when a project has to comply with a standard older than the one it is compiled with. This can happen if:

  • The compiler doesn't allow setting a specific standard. For example, MSVC doesn't allow specifying a standard older than C++14.
  • The project wants to be compiled with the latest standard while still being compliant with an older standard.
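
For instance, a project compiled with C++20 that must remain compliant with C++17 could set (the value must be one of the standards listed above):

sonar.cfamily.reportingCppStandardOverride=c++17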

Solution with a mix of C# and C++

When you have a Solution made of C++ and C#, in order to both use Build Wrapper and have an accurate analysis of the C# code, you must use the SonarScanner for MSBuild. The SonarScanner for MSBuild does not handle sonar-project.properties files, so the Build Wrapper output directory will have to be set during the MSBuild begin step.

Note that in this scenario, source code stored in shared folders, not considered as a "Project" by Visual Studio, won't be scanned.

  • Download and install both the SonarScanner for MSBuild and Build Wrapper (see Prerequisites section).
  • Execute the SonarScanner for MSBuild begin step with the compilation database parameter /d:sonar.cfamily.compile-commands=<build_wrapper_output_directory>/compile_commands.json
  • Add the execution of Build Wrapper to your normal MSBuild build command.
  • Execute the SonarScanner for MSBuild end step to complete the analysis.

For example:

SonarScanner.MSBuild.exe begin /k:"cs-and-cpp-project-key" /n:"My C# and C++ project" /v:"1.0" /d:sonar.cfamily.compile-commands="build_wrapper_output_directory/compile_commands.json"

build-wrapper-win-x86-64.exe --out-dir build_wrapper_output_directory MSBuild.exe /t:Rebuild /nodeReuse:False

SonarScanner.MSBuild.exe end

Measures for header files

Each time we analyze a header file as part of a compilation unit, we compute the following measures for this header: statements, functions, classes, cyclomatic complexity, and cognitive complexity. That means that each measure may be computed more than once for a given header. In that case, we store the largest value for each measure.

Language-specific rule tags

On top of the built-in rule tags, a few additional rule tags are specific to C/C++/Objective-C rules.

Some rules are relevant only starting from a specific version of the C++ standard. These rules will run only when analyzing C++ code compiled against that standard version or a later one. The following tags are used to mark these rules for the corresponding C++ standard version:

  • since-c++11 
  • since-c++14
  • since-c++17
  • since-c++20

C++ rules that do not carry any of these four tags apply from C++98 onward.

Implementation-related rule tags

  • full-project: this tag is for rules that do cross-compilation unit analysis. For these rules to work properly, it is important to analyze the entire project. Excluding part of the project from the analysis will impact the accuracy of these rules: it might lead to false-positives or false-negatives.
  • symbolic-execution: this tag is for rules that reason about the state of the program. They usually work together to find path-sensitive bugs and vulnerabilities. Once a fatal state of the program is reached, one issue will be raised, and the symbolic execution analysis of the current path will stop. For that reason, it is not recommended to evaluate these rules independently of each other as it might give a false sense of undetected issues. It is important to keep in mind that we are always working on improving these rules, as symbolic execution can never be perfect.

External standard rule tags

The following tags indicate how a rule relates to the MISRA guidelines:

  • based-on-misra: this tag is for rules that address the same kind of issues as MISRA rules, but that do not entirely correspond to what MISRA specifies (usually to make them less strict).
  • misra-c++2008
  • misra-c++2023
  • misra-c2004
  • misra-c2012

The following tags represent the category of the rule according to the MISRA compliance 2020 document and indicate whether violations of the rule are permitted (see Chapter 5: “The guideline re-categorization plan”):

  • misra-mandatory
  • misra-required
  • misra-advisory

Quality profiles

  • Like all other languages supported by SonarCloud, C, C++, and Objective-C come with the "Sonar way" profile. This is Sonar's recommended profile, designed to fit most projects.
  • For C++, we also provide the "Mission critical" quality profile. It is our recommendation for modern C++ development (C++17 and beyond) for mission-critical software. It is based on MISRA C++ 2023 and it trades more constraints on your code for more code safety.
