Release flow implementation with SAST/DAST on Azure DevOps

Today we will talk about one of the most popular product delivery flows, commonly used in small and medium-sized projects: release flow. Release flow originates from the mainline (trunk-based) flow, where there is only one main branch and short-lived feature branches. In release flow, a new release candidate is cut from the main branch at a fixed interval, say every month or every few weeks. From that point on, the release branch becomes the single source of truth for every environment in the release management process. Of course, a hotfix may happen, and the fix must then be applied to the appropriate release branch and to the main branch only.

Release flow

The main benefits of this flow are:

– every release branch is isolated;

– release frequency is high: every month or every week;

– feature branches are short-lived, which reduces operational effort and makes the source repository easy to manage. This can be critical when static analysis or semantic versioning runs on every commit;

– it fits a feature-based approach perfectly.

As for disadvantages:

– the hotfix process becomes a bit more complicated;

– additional operational effort is needed to create and maintain a release branch each time, but this can easily be automated, as shown in the sketch below.
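Such automation could be a scheduled Azure DevOps pipeline that cuts the branch on a calendar basis. Below is a minimal sketch; the monthly schedule, the release/YYYY.MM naming convention and the Linux agent are my assumptions, and the build service account needs Contribute permission on the repository for the push to work:

# Hypothetical scheduled pipeline that cuts a new release branch from main.
# Schedule, branch name format and Linux agent are assumptions.
schedules:
- cron: "0 6 1 * *"          # 06:00 UTC on the first day of every month
  displayName: Monthly RC cut
  branches:
    include:
    - main
  always: true

trigger: none

steps:
- checkout: self
  persistCredentials: true    # keep the access token so that git push works

- script: |
    BRANCH="release/`date +%Y.%m`"   # backticks avoid clashing with $(...) pipeline macros
    git checkout -b "$BRANCH"
    git push origin "$BRANCH"
  displayName: Cut release branch from main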

Implementation on hybrid infrastructure with the Azure DevOps cloud service

Below is a very straightforward architecture diagram based on the Azure DevOps cloud service and on-premises application host servers:

The solution consists of an Azure DevOps repository that stores the application source code. Once a feature has been tested and approved, it is merged into the main branch. The release manager then creates a release candidate (RC) branch, which results in a new RC package being published to the Azure DevOps Artifacts feed.
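How the RC package actually lands in the Artifacts feed depends on the project type. As an illustration, here is a minimal sketch of packaging steps for a .NET solution; the csproj pattern and the rc-feed name are hypothetical:

# Hypothetical packaging steps for the RC build; project pattern and feed name are assumptions.
steps:
- task: DotNetCoreCLI@2
  displayName: Pack RC package
  inputs:
    command: 'pack'
    packagesToPack: '**/*.csproj'
    versioningScheme: 'byBuildNumber'   # package version follows Build.BuildNumber

- task: DotNetCoreCLI@2
  displayName: Push RC package to Azure Artifacts
  inputs:
    command: 'push'
    packagesToPush: '$(Build.ArtifactStagingDirectory)/*.nupkg'
    nuGetFeedType: 'internal'
    publishVstsFeed: 'rc-feed'          # hypothetical internal feed name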

SAST and dependency analysis

During the RC build, it is good practice to execute additional checks such as SAST or dependency analysis. I use the Mend Unified Agent to perform dependency analysis. It needs a little configuration on the server side and a Java runtime. The Unified Agent runs as a standalone installation and sends results directly to the Mend cloud server.

You have to purchase an apiKey first.
jobs:
- job: CI
  steps:

  - ${{ if eq(parameters.skipMend, false) }}:
    - task: DotNetCoreCLI@2
      displayName: 'Restore'
      inputs:
        command: 'restore'
        projects: |
          **/${{ parameters.solutionName }}.sln
        verbosityRestore: Normal
        
    - task: CmdLine@2
      inputs:
        script: |
          "C:\Program Files\AdoptOpenJDK\jdk-11.0\bin\java.exe" -Xms1024M -Xmx1024M -jar D:\ADO\whitesource\wss-unified-agent.jar -apiKey $(SHARED-MEND-APIKEY) -product PRODUCT -project $(Build.DefinitionName) -wss.url https://app-eu.whitesourcesoftware.com/agent
        workingDirectory: $(System.DefaultWorkingDirectory)
      displayName: Mend Scan
      condition: succeeded()

  - ${{ if eq(parameters.skipSonar, false) }}:
    - task: SonarQubePrepare@5
      displayName: '[SonarQube] Prepare analysis'
      inputs:
        SonarQube: 'SonarQube Server'
        scannerMode: 'MSBuild'
        projectKey: ${{ parameters.projectKey }}
        projectName: ${{ parameters.projectName }}
        projectVersion: '$(Build.BuildNumber)'
        extraProperties: |
          sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)\**\*.trx
          sonar.cs.vscoveragexml.reportsPaths=$(Agent.TempDirectory)\**\*.coveragexml
          sonar.coverage.jacoco.xmlReportPaths="target/site/jacoco/jacoco.xml,target/site/jacoco-it/jacoco.xml,build/reports/jacoco/test/jacocoTestReport.xml"

As a static analyzer, I leverage SonarQube (SQ). This is a standalone installation in an on-premises environment; the cloud version might be used as well. SQ provides thorough analysis according to the project language and the libraries used, along with test coverage and duplicated code detection.
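Note that the prepare task above only configures the scanner; the analysis itself runs after the build. A minimal sketch of the steps that would typically follow it, reusing the solutionName parameter from the template above:

# Steps that typically follow SonarQubePrepare: build the solution, run the analysis, publish the quality gate result.
    - task: DotNetCoreCLI@2
      displayName: 'Build'
      inputs:
        command: 'build'
        projects: |
          **/${{ parameters.solutionName }}.sln

    - task: SonarQubeAnalyze@5
      displayName: '[SonarQube] Run code analysis'

    - task: SonarQubePublish@5
      displayName: '[SonarQube] Publish quality gate result'
      inputs:
        pollingTimeoutSec: '300'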

Release management

In this example, the release management process is made up of integration test, pre-production, and production environments. On each of them, a number of automated processes are configured to ensure the quality and integrity of the application version.

name: "Release-$(rev:r)"

trigger: none

resources:
  pipelines:
  - pipeline: build
    source: CI
    branch: main

appendCommitMessageToRunName: false

parameters:
- name: Leg
  displayName: Deployment Leg (1 - BLUE, 2 - GREEN)
  values:
  - 1
  - 2
  default: 1

variables:
- name: TemplateFolder
  value: /Pipelines/templates

stages:
- template: ${{ variables.TemplateFolder }}/integration.yml
- template: ${{ variables.TemplateFolder }}/pre-prod.yml
- template: ${{ variables.TemplateFolder }}/production.yml
  parameters: 
    Leg: ${{ parameters.Leg }}

It is important to note:

– the RC is immutable and never changes after it has been created; it goes through all release management environments in the same form and finally lands on production;

– no change is deployed without approval at the appropriate release management stage;

– there might be additional automated conditional approvals based on Azure Monitor alerts or Azure DevOps Board status.

The production environment usually consists of a pool of servers with a load balancer on top. This increases availability, accessibility, and processing power. A good practice is to use a blue/green or canary deployment model in production. This practice significantly reduces downtime and provides a quick way to fail over.
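To illustrate how the Leg parameter from the release pipeline maps to a blue/green deployment, here is a minimal sketch of what the production stage template (production.yml) could look like; the stage layout, the environment names and the placeholder deploy step are my assumptions, not the actual template:

# Hypothetical production.yml stage template; environment names and the deploy step are assumptions.
parameters:
- name: Leg
  type: string
  default: '1'

stages:
- stage: Production
  jobs:
  - deployment: DeployProduction
    displayName: Deploy to leg ${{ parameters.Leg }}
    # Approvals and checks are configured on the Azure DevOps environment itself
    environment: 'production-leg-${{ parameters.Leg }}'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: build          # the immutable RC artifact from the CI pipeline resource
          - script: echo "Deploying RC to leg ${{ parameters.Leg }}"
            displayName: Deploy package (placeholder)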

DAST with OWASP ZAP docker container

DAST, or dynamic analysis, tests might run on the same environment as the integration tests or on a completely isolated, dedicated environment. In my case, the OWASP ZAP Docker container has been installed on premises and runs against the application version. Below is a pretty simple example of how to use ZAP with Docker and Azure DevOps:

# Run docker container
- bash: docker run -d -p <container_port>:<target_port> <your_image>
  displayName: 'Zap Container'

# Scan: zap-full-scan.py exits non-zero when alerts are found, so "|| true" keeps the step from failing
- bash: |
    chmod -R 777 ./
    docker run --rm -v $(pwd):/zap/wrk/:rw -t owasp/zap2docker-stable zap-full-scan.py -t http://$(ip -f inet -o addr show docker0 | awk '{print $4}' | cut -d '/' -f 1):<container_port> -x xml_report.xml || true
  displayName: 'Zap Scan'

# Prepare report
- powershell: |
    $XslPath = "<repo_name>/xml_to_nunit.xslt" 
    $XmlInputPath = "xml_report.xml"
    $XmlOutputPath = "converted_report.xml"
    $XslTransform = New-Object System.Xml.Xsl.XslCompiledTransform
    $XslTransform.Load($XslPath)
    $XslTransform.Transform($XmlInputPath, $XmlOutputPath)
  displayName: 'Prepare Report'

# Publish report
- task: PublishTestResults@2
  displayName: 'Publish Test Results'
  inputs:
    testResultsFormat: 'NUnit'
    testResultsFiles: 'converted_report.xml'

For more information, please see the awesome article by CircleCI.

This was a fairly straightforward release management solution based on Git release flow, the Azure DevOps cloud, and on-premises infrastructure. I showed you how to use static analysis tools such as Mend and SonarQube in Azure DevOps pipelines, and OWASP ZAP in a Docker container for dynamic penetration testing.

If you like the post, please subscribe to the newsletter so you do not miss the latest updates.

Be ethical, protect your privacy!
