Reducing SharePoint Framework Code Smells: 1 – Setting up SonarQube in Azure

This is a three-part series on how to set up SonarQube as a Quality Gate in your SharePoint Framework development process. The end goal is to add SonarQube to your build and release process through Azure DevOps. These three articles will explain:

  1. How to set up a sample SonarQube server in Azure
  2. How to run a code review manually
  3. How to integrate the code review into your Azure DevOps build and release process.

As part of a quality development process, developers should be linting their code, running unit tests and so forth; another step which can be added to that process is a “Code Quality” check using the open source project SonarQube.

In this article we will see how to create a stand-alone sample SonarQube server in Azure (and locally too, if you really want).

Introduction

“SonarQube provides the capability to not only show health of an application but also to highlight issues newly introduced. With a Quality Gate in place, you can fix the leak and therefore improve code quality systematically.” 

In practice, what this means is an additional tool which developers can use to write better, more maintainable code. This increases quality and reduces overall maintenance costs when implemented as part of a continuous build and deploy process.

There are plugins for JavaScript and TypeScript, which makes this very applicable to SharePoint Framework development.

Setting up the server

The first step is to create a SonarQube server upon which your code can be reviewed. Some VERY nice person by the name of vanderby has created an ARM template to “Deploy Sonar Cube to Azure“. It is limited by using an embedded database, but it will at least show you the basics before you are ready to scale this properly.
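To make the template deployment concrete, here is a rough sketch using the Azure CLI. The resource group name, location, and `siteName` parameter are placeholders of my own, not necessarily the parameters that template actually defines, so check its azuredeploy.json for the real ones:

```shell
# Create a resource group and deploy an ARM template into it.
# The group name, location, and siteName parameter are placeholders --
# match them to the parameters the template actually defines.
az group create --name sonarqube-demo-rg --location eastus

az deployment group create \
  --resource-group sonarqube-demo-rg \
  --template-file azuredeploy.json \
  --parameters siteName=my-sonarqube-demo
```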

As the GitHub page states, it does take a while to get started, but once it is up you can start to use it.

To log into the server I used the default admin/admin credentials. As this is a sample setup, it doesn’t really matter.

Creating a project

Once you are set up and running you can create a project and a key which can then be used to access the server from a command line interface (CLI).

Under the administration section, create a new project and, once that is complete, generate a key for your project.

Using these credentials we can test our code from the command line.
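As a sketch of what that command line test looks like, assuming the SonarScanner CLI is installed and on your PATH (the server URL, project key, and token below are placeholders standing in for the values created above):

```shell
# Run a SonarQube analysis of an SPFx project from the command line.
# Replace the host URL, project key, and token with your own values.
sonar-scanner \
  -Dsonar.host.url=https://my-sonarqube.azurewebsites.net \
  -Dsonar.projectKey=spfx-sample \
  -Dsonar.login=YOUR_GENERATED_TOKEN \
  -Dsonar.sources=src
```

These same properties can alternatively live in a sonar-project.properties file at the project root, which keeps the command itself short.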

Conclusion

Setting up a sample SonarQube server in Azure is pretty simple. As the template notes, though, this will not scale, and if you are going to use it in an enterprise it will need a more robust setup. But for the sake of demonstration, it’s just fine.

In the next article we will look at how to apply this to an Azure DevOps build and deploy process for SPFx.


Note

You can just as easily set up your own local SonarQube server by following the “Get Started in 2 minutes” installation instructions.
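If you prefer containers, running the official Docker image is probably the fastest local option (an alternative to the zip download in those instructions):

```shell
# Run a throwaway local SonarQube server using the official Docker image,
# then browse to http://localhost:9000 (default login: admin/admin).
docker run -d --name sonarqube -p 9000:9000 sonarqube:community
```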


PSC Tech Talks – A Journey to the Programmable Data Center

During this PSC Tech Talk, Geremy Reiner gave us an overview of his “Journey to the Programmable Data Center”. The emphasis of the presentation was not on the technologies involved, but on the concepts and processes which enable infrastructure to be deployed as code, and on the business solutions that such infrastructure then enables.

Background

There is more to innovation than technology for the sake of technology. When asking why we should build programmable datacenters, the answer is much more than “because the technology is better”. We need to consider how a modern datacenter:

  • Provides a business focused approach to infrastructure
  • Simplifies datacenter management
  • Increases speed of delivery
  • Extends benefits of automation and orchestration

Datacenter Ascendancy

As technology has evolved, so has the way we use it to solve business problems. But technology is not the only thing which has to evolve to maximize the cost reduction and productivity gains which a modern datacenter can provide. The organization has to embrace the new capabilities as well.

A traditional datacenter is stable, secure and reliable, but to achieve that it has a large footprint, is generally utilized at only 20% of capacity on average, has a high management cost, and is very expensive to scale.

A virtual datacenter has increased scalability, can be managed from a computer rather than at a rack terminal, is generally utilized at 50% or greater, and makes it much quicker to stand up a new capability.

Cloud computing, or “IT as a service”, uses highly automated self-service portals, the abstraction of infrastructure creation, and “click of a button” deployment of managed services. With a global footprint, the capacity-on-demand model now allows a business to plan for the future without having to make large CAPEX investments covering its needs for the next five years.

Organizational maturity

As the organization matures, so can the technology. When the needs of the business can be met in a truly self-service manner, where everything from a new site to a new templated service can be deployed with nothing more than a set of configuration parameters and a button, the automated datacenter comes into its own.


Software defined datacenter

Geremy went into more depth about what a programmable datacenter is composed of. From application to automation to infrastructure, all with business oversight, the modern architected datacenter provides visibility at all levels.


So then what?

With all this in place, Geremy then got into the real business benefits, with examples, of where the modern data center enables business flexibility, cost savings, speed to market, and so on.

Process automation

When we talk about process automation at a high level, we are generally talking about frameworks like ITIL, which cover the best practices for delivering IT services. Responding to the needs of users, outages, and other unplanned issues requires the ability to know what is going on at any time and to respond in a repeatable manner.

In a modern datacenter that is generally an amazingly well defined, automated process.

If a service looks like it is not responding as expected, a new instance of the service is spun up, the necessary configuration changes are made to direct traffic to the new service, and the old one is turned off – automatically. The end goal is for this to be seamless to the end user.

Continuous Delivery 

The modern datacenter enables us to create business-enabling “DevOps” capabilities whereby not only is code tested automatically, but the infrastructure necessary to run the tests on is created programmatically at test time. Servers and test suites are stood up and then broken down (or turned on and then turned off) as necessary. This level of automation allows high productivity while keeping costs down for the business.
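As an illustrative sketch only (the resource names and template file here are hypothetical), an ephemeral test environment in a CI script might look like this:

```shell
# Stand up a disposable test environment, run the test suite against it,
# and tear everything down again. Resource group name and the template
# file are placeholders.
az group create --name ci-test-rg --location eastus
az deployment group create \
  --resource-group ci-test-rg \
  --template-file test-environment.json
npm test
az group delete --name ci-test-rg --yes --no-wait
```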

Azure Resource Manager (ARM) templates

There are configuration standards for describing how your infrastructure should be created, deployed, sized and run. This can make a sizeable difference in how quickly you can deploy capabilities for your business.

As an example, if you wanted to go from zero capability to a deployed SharePoint farm with SQL Server and supporting services, you would be looking at a quarter to a half million dollars’ worth of capital investment in hardware and infrastructure, plus months of planning, service creation, setup and configuration, and then installation of the software.

With ARM you can literally deploy the entire SharePoint stack, hosted on Azure and using 9 servers, within 15 minutes at the click of a button. At the time of the presentation this build would have cost approximately $5,000 a month. The cost benefits are clear and significant.

To help organizations get started with Azure, Microsoft has created many open source ARM templates and posted them on GitHub for general consumption and improvement. They can be downloaded and configured for your own needs, and you can be up and running within hours, not months.
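For example, a quickstart template can be deployed directly from GitHub with the Azure CLI. The storage account template referenced below is one of the simpler templates in the Azure/azure-quickstart-templates repository (path assumed from that repo’s layout); the resource group name is a placeholder:

```shell
# Deploy a quickstart ARM template straight from its GitHub raw URL.
az group create --name quickstart-demo-rg --location eastus

az deployment group create \
  --resource-group quickstart-demo-rg \
  --template-uri https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/quickstarts/microsoft.storage/storage-account-create/azuredeploy.json
```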


Working in the real world

PSC worked with one of our clients to create a 19-server, repeatably deployable process through which they could sell their services to end customers. Through a web interface, the client team could answer questions on a form, which in turn built a custom ARM template. The ARM template was then used programmatically to automate the deployment of the necessary environment for the end client, based on their requirements.

Conclusion

A modern data center is designed around which business needs it can flexibly solve for end users, now and in the future, rather than how it can rigidly support the needs of the past. PSC has proven experience in deploying infrastructure as a service using ARM templates, automating the deployment and management of virtual infrastructure, and utilizing modern datacenters to help our customers future-proof their technology needs.