Getting started with Terraform

I’ve become a fan of Terraform since I started using it at work to manage our Azure environments, and more recently with Azure DevOps. Terraform is an infrastructure-as-code tool that lets you create and manage cloud-based infrastructure in a codified way. It is an open-source tool that codifies APIs into declarative configuration files (*.tf) that can be kept under source control to version your changes.

Infrastructure as Code (IaC) is the practice of provisioning and managing computing infrastructure with a declarative approach, so the definitions can be kept under source control and every change you make to your infrastructure is versioned. It is a modern approach to managing infrastructure and is often called the “foundation for DevOps”.


Using Infrastructure as Code as part of your deployment process has a number of benefits to your workflow:

  • Consistency – IaC remains consistent through source code management, which makes rolling back straightforward when mistakes are made.
  • Speed – Provisioning infrastructure declaratively is faster than navigating manually through an interface.
  • Reusability – Existing code can be reused for provisioning and deployment across multiple platforms.
  • Extensibility – Existing code can be extended for new provisioning and deployment scenarios.

Terraform uses the concept of “providers” – plugins that extend the capabilities of the tool – to interact with various cloud systems in a number of different ways.
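As a minimal sketch of the declarative style, a configuration using the Azure provider might look like this (the resource group name and region are illustrative, and depending on your azurerm provider version a `features {}` block may be required):

```hcl
# Configure the Azure provider (illustrative; required arguments vary by provider version)
provider "azurerm" {
  features {}
}

# A hypothetical resource group to demonstrate the declarative style
resource "azurerm_resource_group" "example" {
  name     = "rg-terraform-demo"
  location = "West Europe"
}
```

Running `terraform plan` against a file like this shows the changes Terraform would make before you apply them.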

Installing Terraform

I’m using Windows 64-bit.

Terraform is packaged as a zip archive. You can also install it with Chocolatey, which works well too.

  • After downloading Terraform, unzip the package.

  • Copy the files from the zip to a folder such as “c:\terraform”. That’s my Terraform path. PATH is the system variable that the operating system uses to locate executables from the command line or Terminal window.

For Windows 8 

  • Search for : System (Control panel)

  • Click the Advanced system settings link.

  • Click Environment Variables.

  • In the section System Variables, find the PATH environment variable and select it. Click Edit. If the PATH environment variable does not exist, click New.

  • In the Edit System Variable window, append the Terraform path, “c:\terraform;”, to the end of the PATH value.

  • Click OK, then OK, then OK.
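Alternatively (assuming Terraform was extracted to c:\terraform), the same PATH change can be made from a Command Prompt. Note that setx only affects new sessions, and it truncates values longer than 1024 characters:

```bat
setx PATH "%PATH%;c:\terraform"
```

Open a fresh Command Prompt afterwards so the updated PATH is picked up.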

Verify Install

Once you have configured Terraform, you should be able to run its commands. Open a command prompt and type terraform; the Terraform usage text should appear.
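For example (output abbreviated; the exact text varies by Terraform version):

```bat
C:\> terraform
Usage: terraform [-version] [-help] <command> [args]
```

If you instead see “'terraform' is not recognized…”, re-check the PATH entry from the previous step.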

Terraform Extensions for Visual Studio Code

  • Once the Visual Studio Code installer completes, it can be opened from the Start Menu.
  • To install extensions, press F1 and start typing “ext”, or press Ctrl + Shift + X.
  • Choose Extensions: Install Extensions.
  • Install the Terraform and Azure Terraform extensions.

Now you’re ready to write some code!

Happy Coding!



Database Continuous Integration and Delivery (CI/CD) using Azure DevOps – Part 2

In my previous post, Database Continuous Integration and Delivery (CI/CD) using Azure DevOps – Part 1,

I described how to create a new, empty database project in Visual Studio. One of the ways to bring in an existing database (from a server) was the Import Database option.

Database CI/CD Implementation

Now that your project is under source control, we can implement the database continuous integration and delivery (CI/CD) pipeline.

Build definition

Navigate to your Azure DevOps environment. Go to Pipelines in your project and create a new build pipeline.

You can use a different source control provider based on your project’s requirements; I am using GitHub for this demo. Connect to your Git repository using a service connection or a GitHub personal access token. Once you select the repo, you can choose the default branch for your builds.

Select a template. I’ll use configuration as code through YAML, so that the pipeline definition is checked in to source control, but you can also choose the out-of-the-box options in the user interface to run the build.

Create the build.yaml file in the solution and browse to that location so the pipeline points at it.

Having got this far, we can go ahead and select “Save”, and a new build will be created using the “DatabaseDemo” build definition we created earlier.

Create the steps in the build.yaml file to build the solution and publish the artifacts after the build. The code is as follows:

# Build number format
name: $(Rev:rr)

resources:
- repo: self

# Define the build agent
pool:
  name: Hosted VS2017
  demands: msbuild

steps:
# Define the Visual Studio build
- task: VSBuild@1
  displayName: 'Build solution **\*.sln'
  inputs:
    msbuildArgs: '/p:DeployOnBuild=True /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:PackageLocation="$(build.artifactstagingdirectory)"'
    platform: 'Any CPU'
    configuration: Release
    clean: true

# Copy the .dacpac file to the target folder
- task: CopyFiles@2
  displayName: 'Copy Files to: Demo.Database'
  inputs:
    SourceFolder: Demo.Database
    Contents: '**\*.dacpac'
    TargetFolder: DatabasePublishFiles

# Publish the artifact to the destination folder
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: Database'
  inputs:
    PathtoPublish: DatabasePublishFiles
    ArtifactName: Database

Once the build is triggered and finishes, the artifact containing the .dacpac file will be generated in the target destination.

Release definition

Navigate to Pipelines > Releases and create a new empty Release pipeline.


Set Artifacts

Add the artifacts – choose the build pipeline that produces the artifact as the Source.


Add the task by selecting the Azure SQL Database deployment task.

Fill in the required fields in the template.

Setting up Environments

This way, you can set up tasks for each individual environment, or clone an existing task.


You’re done!

Database Continuous Integration and Delivery (CI/CD) using Azure DevOps – Part 1

Using SSDT, we can create a database project and implement schema changes by adding, modifying, or deleting object definitions (represented by scripts) in the project. This puts your database code and procedures under source control, where they can be deployed through a CI/CD pipeline.

The output of an SSDT project is a DACPAC (Data-tier Application Package) file. It is a logical database management entity that contains the database model – tables, views, and instance objects. The generated DACPAC file can be deployed to multiple environments.
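For example, a DACPAC can also be deployed from the command line with SqlPackage; the file path, server name, and database name below are placeholders:

```bat
SqlPackage.exe /Action:Publish ^
  /SourceFile:"bin\Release\Demo.Database.dacpac" ^
  /TargetServerName:"myserver.database.windows.net" ^
  /TargetDatabaseName:"DemoDb"
```

The release pipeline described in Part 2 performs essentially this operation for you through the Azure DevOps deployment task.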

I will explain the continuous integration and continuous delivery of the database project.

Creating the Project

  • Create a new SQL Server Database project in Visual Studio

create project

  • Import the existing database.

  • Provide the database server details

  • Leave the other settings as they are and click start.


  • After a few seconds the import should be complete.


  • The database tables and procedures are successfully imported.

  • Once you build the project, you will be able to see the .dacpac file created in the bin folder.

The series continues in Database Continuous Integration and Delivery (CI/CD) using Azure DevOps – Part 2.

Sitecore Admin Pages (Sitecore 8)

Sitecore comes with various admin pages that provide significant features for developers and administrators.

Some of these admin pages are not well known, so I’ve compiled a quick list of handy admin URLs for Sitecore.


/sitecore/admin/showconfig.aspx

This page displays the final rendered version of the Sitecore section of the web config. Sitecore merges the config files in the /App_Config/Include/ folder into web.config, and the Show Config tool displays the final rendered result of all the included configs.



/sitecore/admin/cache.aspx

This page displays the caching configured for the instance, including the prefetch cache, data cache, item cache, and HTML cache, as well as any specifically defined caches. It shows the current in-use cache levels and the defined maximum thresholds, and it allows any user to instantly clear the Sitecore caches.



/sitecore/admin/dbbrowser.aspx

This page displays a view of the Sitecore databases and file system. It is similar to the content tree itself, but it is a less intensive interface that only loads sub-items when parent items are selected. It also provides the option to delete all of an item’s children – something that is not available from the Content Editor without writing custom code.



/sitecore/admin/FillDB.aspx

This page allows developers to quickly create huge amounts of content based on a number of variables that are configurable in the tool. It is disabled by default and can be enabled using the EnableFillDB setting in the /App_Config/Include/Sitecore.Buckets.config file.

<setting name="EnableFillDB" value="true" />

On the first run, you’ll need to run all the steps. After the initial setup, only the last three steps need to be run: 4) Clear site caches, 5) Generate items, 6) Rebuild index(es).



/sitecore/admin/LinqScratchPad.aspx

This page allows you to run LINQ queries against your indexes from within the browser. You can also experiment with different POCOs (Plain Old CLR Objects) and settings, and test the performance of your queries.



/sitecore/admin/login.aspx

This page serves as the default login page. If you are not authorized to access an admin page, you will be redirected here. Usefully, it honors the “returnUrl” querystring parameter, returning you to the same page after authentication.



/sitecore/admin/MediaHash.aspx

This page helps generate media URLs with dynamic image-scaling properties, appending the required hash value to the media URL.



/sitecore/admin/pipelines.aspx

Pipelines separate the implementation of processes into a series of discrete processors, each typically responsible for a single operation invoked by one or more components within the system.

This page displays each pipeline/processor name, the number of executions, the wall time, and the time per execution.

Pipeline profiling is disabled by default. To enable it, rename the /App_Config/Include/Sitecore.PipelineProfiling.config.disabled file to Sitecore.PipelineProfiling.config and make sure the “Pipelines.Profiling.Enabled” setting is set to “true”.

<setting name="Pipelines.Profiling.Enabled" set:value="true" />

To measure CPU usage during pipeline profiling, set the value of the “Pipelines.Profiling.MeasureCpuTime” setting to “true”. Measuring CPU usage adds a performance overhead to the pipeline but provides additional information about the behavior of the processors.

<setting name="Pipelines.Profiling.MeasureCpuTime" set:value="true" />



/sitecore/admin/RebuildReportingDB.aspx

This page allows developers to rebuild the analytics reporting database. To facilitate rebuilding of analytics data, Sitecore requires two reporting databases, so that one remains available for reporting while the other is being rebuilt. The rebuild can then be tested through this administrative screen.



/sitecore/admin/RemoveBrokenLinks.aspx

This page allows developers to remove broken links to missing items in a selected database. It is helpful when content has been restructured, imported, or cleaned up. It also offers the option to serialize the affected items so that they can be easily restored across instances.



/sitecore/admin/restore.aspx

This page helps developers restore archived content back to a specified database.



/sitecore/admin/serialization.aspx

This page provides an interface to serialize database content into XML. Serialization lets you back up database content, place it under version control, and create a basis for version comparisons. The serialized data is written to the website’s Data folder. The tool allows developers to both serialize and update the Sitecore databases.



/sitecore/admin/SetSACEndpoint.aspx

This page allows developers to change the Sitecore App Center endpoint. It is useful when dealing with the Email Campaign Manager.



/sitecore/admin/stats.aspx

This page displays rendering statistics for all registered sites, including load times, cache sizes, average time, last run, and total items. It also shows how many times each component has been loaded from the cache.



/sitecore/admin/unlock_admin.aspx

This page is used to unlock the admin account when it has been locked out by invalid login attempts.

The page is disabled by default. To enable it, modify the unlock_admin.aspx page and set enableUnlockButton to true.

private bool enableUnlockButton = true;



/sitecore/admin/UpdateInstallationWizard.aspx

This page allows developers to update the Sitecore instance. It lets you upload .update packages and execute them against the Sitecore instance.


Happy Sitecoring 🙂