Reducing SharePoint Framework Code Smells: 5 – Connecting Azure DevOps to SonarQube

This is a series on how to set up SonarQube as a Quality Gate in your SharePoint Framework development process. The end goal is to add SonarQube to your build and release process through DevOps. These articles will explain:

  1. How to set up a sample SonarQube server in Azure
  2. Setting up a unit test sample locally
  3. How to set up sonar-scanner and connect it to SonarQube
  4. Configuring Sonar Scanner to test only our code
  5. Connecting Azure DevOps to SonarQube

Introduction

In the previous article we saw how to set up sonar-scanner to scan only the code we actually wanted to be tested. In this final article in the series we will look at adding SonarQube to Azure DevOps and then how to hook it into our build process.

Adding SonarQube to Azure DevOps

The documentation for SonarQube and Azure DevOps can be found here:

https://docs.sonarqube.org/latest/analysis/scan/sonarscanner-for-azure-devops/

Adding SonarQube to Azure DevOps can be done through the marketplace.

Once added by an administrator, you can then add a “Service Connection” for SonarQube through the administrative section.

Once added, the Sonar Scanner and SonarQube tasks can be added to your build process.

Adding Sonar Scanner to your build process

When adding a new step to the build process, searching for “sonar” brings up all the new capabilities and options.

We need to add the Prepare, Run and Publish steps to the build process.

When adding the Prepare Analysis stage we are essentially configuring sonar-scanner. In the advanced box you would add all the configuration settings we talked about in the previous article (from the CLI):

  • sonar-scanner.bat
    -D"sonar.projectKey=IceCreamShop"
    -D"sonar.host.url=http://xominosonarqube.azurewebsites.net"
    -D"sonar.login=dba68d82c931efe82e8692c9a25bc7c31736b286"
    -D"sonar.sources=src/webparts/iceCreamShop"
    -D"sonar.tests=src/webparts/iceCreamShop"
    -D"sonar.typescript.lcov.reportPaths=jest/lcov.info"
    -D"sonar.exclusions=src/webparts/iceCreamShop/test"
    -D"sonar.test.inclusions=**test.ts,**test.tsx"

although in this case, because we are not passing them in through the CLI, the -D prefix is unnecessary.
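
If you build the pipeline in YAML rather than through the classic editor, the Prepare step might look something like the sketch below – a minimal sketch, assuming the SonarQube extension’s SonarQubePrepare task and a service connection named SonarQubeServiceConnection (a hypothetical name). The service connection supplies the host URL and token, so sonar.host.url and sonar.login drop out entirely:

    # Prepare Analysis Configuration: sets up sonar-scanner with our project settings
    - task: SonarQubePrepare@4
      inputs:
        SonarQube: 'SonarQubeServiceConnection'   # the service connection added above
        scannerMode: 'CLI'
        configMode: 'manual'
        cliProjectKey: 'IceCreamShop'
        cliSources: 'src/webparts/iceCreamShop'
        extraProperties: |
          sonar.tests=src/webparts/iceCreamShop
          sonar.typescript.lcov.reportPaths=jest/lcov.info
          sonar.exclusions=src/webparts/iceCreamShop/test
          sonar.test.inclusions=**test.ts,**test.tsx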

The Run Code Analysis step actually runs sonar-scanner, and the Publish Quality Gate Result step allows the response from SonarQube to be understood and processed as part of the build process.

More information on the plugin can be found here:

https://docs.sonarqube.org/latest/analysis/scan/sonarscanner-for-azure-devops/

  • Prepare Analysis Configuration task, to configure all the required settings before executing the build.
    • This task is mandatory.
    • For .NET solutions or Java projects, it helps to integrate seamlessly with the MSBuild, Maven and Gradle tasks.
  • Run Code Analysis task, to actually execute the analysis of the source code.
    • This task is not required for Maven or Gradle projects, because the scanner will be run as part of the Maven/Gradle build.
  • Publish Quality Gate Result task, to display the Quality Gate status in the build summary and give you a sense of whether the application is ready for production “quality-wise”.
    • This task is optional.
    • It can significantly increase the overall build time because it will poll SonarQube until the analysis is complete. Omitting this task will not affect the analysis results on SonarQube – it simply means the Azure DevOps Build Summary page will not show the status of the analysis or a link to the project dashboard on SonarQube.
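
For completeness, a sketch of the remaining two tasks in YAML form (the task names come from the SonarQube extension; the major version numbers may differ on your tenant):

    # Run Code Analysis: executes sonar-scanner using the prepared configuration
    - task: SonarQubeAnalyze@4

    # Publish Quality Gate Result: polls SonarQube and shows the gate status in the build summary
    - task: SonarQubePublish@4
      inputs:
        pollingTimeoutSec: '300'   # how long to wait for the analysis before giving up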

Conclusion

As we have seen in this series of articles, adding SonarQube to your build process can help increase the quality of your code by providing insights through analysis which would not otherwise have been apparent.

It’s free to use, so why not have a play. 🙂

 

Reducing SharePoint Framework Code Smells: 4 – Configuring Sonar Scanner to test only our code

This is a series on how to set up SonarQube as a Quality Gate in your SharePoint Framework development process. The end goal is to add SonarQube to your build and release process through DevOps. These articles will explain:

  1. How to set up a sample SonarQube server in Azure
  2. Setting up a unit test sample locally
  3. How to set up sonar-scanner and connect it to SonarQube
  4. Configuring Sonar Scanner to test only our code
  5. Connecting Azure DevOps to SonarQube

Introduction

In the previous article we saw how to set up sonar-scanner locally and run a scan of our out-of-the-box sample project. The code was a little smelly and there were some bugs found. In this article we are going to look at where those bugs come from and why they are not relevant to our solution. By the end we will have a working Sonar Scanner correctly scanning our code and running tests.

Looking at the bugs and smells

Part of the reason I left this in was to show some of the beauty of SonarQube and the information it provides us. How does it know there are bugs? What smells?

Looking at our report more closely, we can click on the project and then on Bugs, and we see the following bugs.

There are a couple of important things to note here:

  1. The bugs are coming from the JEST folder
  2. It checked .css as well as .js

Clicking on “See Rule” we can see why it decided this was a bug.

SonarQube runs off pre-built rules (much like a linter does) and flags the code as a “bug” because it breaks one of those rules. This is really interesting and drives us to create better code – which is ultimately the goal of using the capability!

Looking at the bug shows you where it is in the code – pretty cool, eh?

You can poke around with the code failures as well; that’s not why we are here, but it is fun to look around.

These two files are in the JEST folder, which is not “our” code, and therefore there is no need to test them. So let’s configure the scanner to look at only “our” code.

Configuring sonar scanner via CLI

If you are running sonar-scanner locally you can add some of these default values to a sonar-project.properties file in the project root. In this case we are going to continue to pass the parameters in through the CLI, because we are not going to be able to rely on a local properties file when we go to the CI/CD pipeline in Azure DevOps.
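
For reference, a sonar-project.properties file carrying the same defaults would look like this (the values are taken from this article; the login token is deliberately left out, as it is better passed on the command line than committed to source control):

    sonar.projectKey=IceCreamShop
    sonar.host.url=http://xominosonarqube.azurewebsites.net
    sonar.sources=src/webparts/iceCreamShop
    sonar.tests=src/webparts/iceCreamShop
    sonar.typescript.lcov.reportPaths=jest/lcov.info
    sonar.exclusions=src/webparts/iceCreamShop/test
    sonar.test.inclusions=**test.ts,**test.tsx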

We are adding the following new parameters:

  1. updating sonar.sources to point to our code
  2. adding sonar.typescript.lcov.reportPaths so that SonarQube can use the coverage reports created by Jest
  3. adding sonar.tests so that we are specific as to where our tests are located
  4. adding sonar.exclusions so that we are not assessed for coverage on our tests, which would be silly
  5. adding sonar.test.inclusions to identify which files are our “tests”
  • sonar-scanner.bat
    -D"sonar.projectKey=IceCreamShop"
    -D"sonar.host.url=http://xominosonarqube.azurewebsites.net"
    -D"sonar.login=dba68d82c931efe82e8692c9a25bc7c31736b286"
    -D"sonar.sources=src/webparts/iceCreamShop"
    -D"sonar.tests=src/webparts/iceCreamShop"
    -D"sonar.typescript.lcov.reportPaths=jest/lcov.info"
    -D"sonar.exclusions=src/webparts/iceCreamShop/test"
    -D"sonar.test.inclusions=**test.ts,**test.tsx"
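
One assumption worth calling out: sonar.typescript.lcov.reportPaths=jest/lcov.info expects Jest to have already written an lcov report into the jest folder (configured when we set up the unit test sample). If your setup differs, the standard Jest CLI flags below will produce one:

    npx jest --coverage --coverageDirectory=jest --coverageReporters=lcov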

Running the sonar-scanner.bat command again, we get some slightly different results this time…

MUCH better 🙂

Looking at the code coverage section we can see that some additional files have been included that are not really what I wanted to test.

What isn’t covered?

Looking into the files which failed is really helpful because you can visually see where the lines with no coverage are!

Conclusion

In this article we have seen how to configure sonar-scanner to scan our files and not the dependencies. In the next article we will see how to connect to SonarQube from Azure DevOps.


Securing your Azure DevOps SharePoint tenant credentials with an Azure Key Vault

If you are following an automated build and release process for your SharePoint Framework solution, then you will have come across the need to store your tenant SharePoint admin username and password as variables in the pipeline.

While this works, and I believe the credentials are encrypted, this is not going to fly with enterprise corporate security. They are going to insist that the credentials are kept centrally in a secure Key Vault. Conveniently for us, a Key Vault is available for us to use in Azure.

Using the process described by the Azure DevOps Labs team you can set up a KeyVault and integrate it into your pipeline.

I am adding the KeyVault pipeline into an older version of an SPFx release (for the most up to date doc check this post out).
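
At its core the integration is a single task that pulls secrets out of the vault and exposes them as pipeline variables. A minimal YAML sketch, with every name hypothetical (service connection MyAzureServiceConnection, vault my-spfx-keyvault, secret SPTenantPassword):

    # Fetch the tenant admin password from Key Vault into a pipeline variable
    - task: AzureKeyVault@1
      inputs:
        azureSubscription: 'MyAzureServiceConnection'   # hypothetical Azure service connection
        KeyVaultName: 'my-spfx-keyvault'                # hypothetical vault name
        SecretsFilter: 'SPTenantPassword'               # hypothetical secret name

Subsequent steps can then reference the secret as $(SPTenantPassword) in place of the pipeline variable.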

Once that is run, the new password is successfully utilized instead of the variable I had stored within Azure DevOps.

 

Fixing SPFx node-sass binding error on ADO release pipeline

When trying to run gulp upload-to-sharepoint, I encountered the following issue when creating a release pipeline for an SPFx web part: there was no binding available for node-sass.

[command]C:\NPM\Modules\gulp.cmd upload-to-sharepoint --gulpfile D:\a\r1\a\build\release\gulpfile.js --ship --username *** --password *** --tenant mckinseyandcompany --cdnsite sites/apps/ --cdnlib ClientSideAssets
2019-06-12T14:51:53.5954467Z [14:51:53] Working directory changed to D:\a\r1\a\build\release
2019-06-12T14:51:54.5490645Z D:\a\r1\a\build\release\node_modules\node-sass\lib\binding.js:15
2019-06-12T14:51:54.5497252Z throw new Error(errors.missingBinary());
2019-06-12T14:51:54.5498022Z ^
2019-06-12T14:51:54.5498662Z
2019-06-12T14:51:54.5499258Z Error: Missing binding D:\a\r1\a\build\release\node_modules\node-sass\vendor\win32-x64-48\binding.node
2019-06-12T14:51:54.5499538Z Node Sass could not find a binding for your current environment: Windows 64-bit with Node.js 6.x
2019-06-12T14:51:54.5499731Z
2019-06-12T14:51:54.5499883Z Found bindings for the following environments:
2019-06-12T14:51:54.5500034Z - Windows 64-bit with Node.js 8.x

and the error was actually staring us in the face – “binding available for Node 8”…

The solution: just like for the build process, you have to add an agent task to ensure the correct version of Node is used for the release process.
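
In YAML that is the Node.js tool installer task; a minimal sketch pinning the version the node-sass bindings were built for:

    # Install Node 8.x on the release agent before gulp runs, so node-sass finds its binding
    - task: NodeTool@0
      displayName: 'Use Node 8.x'
      inputs:
        versionSpec: '8.x'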

Using npm ci as part of the SPFx CI/CD process through Azure DevOps

During the automated build and deploy process for a SharePoint Framework web part (as documented here), one of the steps you go through to install the application on the build server is the familiar ‘npm install’ step.

This works just fine when working locally, as it should, but it is inefficient as part of an automated build process.

For a good explanation of why, check out this Stack Overflow answer: https://stackoverflow.com/questions/52499617/what-is-the-difference-between-npm-install-and-npm-ci/53325242#53325242

npm install reads package.json to create a list of dependencies and uses package-lock.json to inform which versions of these dependencies to install. If a dependency is not in package-lock.json it will be added by npm install.

npm ci (named after Continuous Integration) installs dependencies directly from package-lock.json and uses package.json only to validate that there are no mismatched versions. If any dependencies are missing or have incompatible versions, it will throw an error.

In my experience this can speed up the build process by more than 50%, and as the npm install is the rate-determining step for the overall build, this is very helpful.

The step in the process for the build should look like this:
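
In YAML form, a sketch of that step using the built-in Npm task (running ‘ci’ as a custom command, which works across task versions):

    # Use 'npm ci' instead of 'npm install' for a faster, reproducible install from package-lock.json
    - task: Npm@1
      displayName: 'npm ci'
      inputs:
        command: 'custom'
        customCommand: 'ci'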

I have submitted a pull request to update the documentation and we will see if it is worthy 🙂