What is Azure Machine Learning?

Azure ML is an end-to-end, cloud-based predictive analytics platform. It caters to organizations, users, and data scientists of all skill levels and experience. Azure ML allows users to import training data; build, train, and deploy machine learning models; and even predict outcomes and cluster data, all from a web browser.

In addition, users can easily import training data from a number of sources, including flat files, Azure Blob storage, Hive queries, Azure SQL databases, web URLs, and more. They can then use the simple, familiar drag-and-drop interface to create common workflow tasks for preparing data, selecting features, and training, scoring, and comparing models.

This post walks through the step-by-step process of uploading a file from your local machine and training a classification model with no-code automated ML (AutoML) in Azure Machine Learning studio.

Prerequisites:

  • Azure Cloud Account
  • Azure Subscription and Resource Group

Step 1:

Log in to the Azure portal.

Enter your email ID and password. You will then be redirected to the Azure portal home page.

Step 2:

Now click on “Create a resource”. Type “Azure Machine Learning” in the search box and select the first option.

The Azure Machine Learning overview page will open. Click the “Create” button. It will ask you to fill in some information:

  • Workspace Name: name of the environment/workspace.
  • Subscription: select the subscription.
  • Resource Group: select the resource group.
  • Region: choose your location.
  • Storage Account: create a new storage account.
  • Key Vault: create a new key vault.
  • Application Insights: create a new Application Insights resource.
  • Container Registry: select None.

Then click the “Review and create” button. It will review all your inputs; if everything is correct, a check mark is shown for each field.

Now, if all field inputs are valid, click the “Create” button.

This will start the creation of the workspace and take you to the deployment page. Wait a few seconds while the deployment runs. Once it is finished, the message “Your deployment is complete” is shown. Now click the “Go to resource” button.
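
If you prefer the command line, the same workspace can be created with the Azure CLI instead of the portal. Here is a minimal sketch, assuming the Azure CLI with its ml extension is installed and you are logged in; the resource group, workspace name, and region are placeholders:

# Sign in and pick the subscription to use
az login
az account set --subscription "<subscription-id>"

# One-time: add the Azure Machine Learning CLI extension
az extension add --name ml

# Create a resource group and the workspace (dependent resources such as storage, key vault, and Application Insights are typically created for you)
az group create --name rg-aml-demo --location eastus
az ml workspace create --name aml-demo-ws --resource-group rg-aml-demo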

Step 3:

It will show your workspace overview page. From here you can manage your whole workspace. Click on “Launch Now”.

A new tab opens with Azure Machine Learning studio for the workspace we just created. There are various options on the left side.

Step 4:

Create a new dataset by selecting From local files from the +Create dataset drop-down.

On the Basic info form, give your dataset a name and provide an optional description. The automated ML interface currently only supports Tabular Datasets, so the dataset type should default to Tabular.

On the Datastore and file selection form, select the default datastore that was automatically set up during your workspace creation, workspaceblobstore (Azure Blob Storage). This is where you’ll upload your data file to make it available to your workspace.

Select Upload files from the Upload drop-down.

Select Next on the bottom left to upload the file to the default container that was automatically set up during your workspace creation.

When the upload is complete, the Settings and preview form is pre-populated based on the file type.

The Schema form allows for further configuration of your data for this experiment. For this example, select the toggle switch for the day_of_week, so as to not include it. Select Next.

On the Confirm details form, verify the information matches what was previously populated on the Basic info, Datastore and file selection and Settings and preview forms.

Select Create to complete the creation of your dataset.

Select your dataset once it appears in the list.

Review the Data preview and Close.
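
For reference, a local file can also be uploaded and registered from the command line with the Azure ML CLI v2, which works with “data assets”, the newer equivalent of the dataset concept used in this walkthrough. A rough sketch, reusing the placeholder names from the workspace sketch above; the file path, asset name, and version are placeholders too:

# Upload a local CSV and register it as a data asset in the workspace
az ml data create --name sample-training-data --version 1 --type uri_file --path ./<your-training-file>.csv --resource-group rg-aml-demo --workspace-name aml-demo-ws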

Step 5:

After the dataset has been registered, the next step is to configure the experiment: select the size of your compute environment and specify which column you want to predict.

Select ‘Automated ML’ and Create New Automated ML run.

Select the dataset from the list and click Next. Configure the run by creating a new experiment:

  • New experiment name: name of the experiment.
  • Target column: select the target column.
  • Select compute type: select Compute cluster.
  • Select Azure ML compute cluster: select an existing compute cluster or create a new one.

Populate the Select virtual machine form to set up your compute.

Select Next to populate the Configure settings form.

Select Create to create your compute target.

On the Select task and settings form, complete the setup for your automated ML experiment by specifying the machine learning task type and configuration settings.

Select View additional configuration settings and populate the fields as required. These settings give you better control over the training job; otherwise, defaults are applied based on the experiment selection and data.

Select Next on the Validate and test form, then click Finish.

The Run Detail screen opens with the Run status at the top as the experiment preparation begins. This status updates as the experiment progresses.

Step 6:

Once the best model has been created, we can deploy it as a web service.
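
As a rough sketch of what that deployment looks like from the command line: the Azure ML CLI v2 deploys registered models to managed online endpoints. The names below are placeholders, and the deployment itself is driven by a YAML definition that is not shown here:

# Create a managed online endpoint to host the model
az ml online-endpoint create --name automl-demo-endpoint --resource-group rg-aml-demo --workspace-name aml-demo-ws

# Attach the model by creating an online deployment from a YAML spec (deployment.yml is a hypothetical file you would author)
az ml online-deployment create --name blue --endpoint-name automl-demo-endpoint --file deployment.yml --resource-group rg-aml-demo --workspace-name aml-demo-ws --all-traffic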

Happy Coding!!!

Generate and Connect Azure Linux VM with SSH Key Pair

Connections to virtual machines in Azure can be established using credentials or SSH. SSH (Secure Shell) is a protocol used to connect to a remote host over an encrypted connection. A Linux VM can be accessed with a password or an SSH key pair.

To create an Azure VM you need a valid Azure account. If you do not have one, you can create it for free using the Azure portal.

In this blog, I will show you how to create an SSH key and connect to Virtual Machine using SSH key.

Create SSH Key Pair on Azure

An SSH key pair consists of two keys: a private key and a public key. The public key is placed on the Linux VM, and the private key is what we present to verify our identity. Public keys are added to the ~/.ssh/authorized_keys file on the VM.

  • Open the Azure portal
  • Search for SSH Keys
  • On the SSH Keys page, select Create
  • Select the subscription and resource group name
  • Type the key pair name
  • Select Review and create
  • After validation, select Create
  • Select “Download private key and create resource”
  • It will download the .pem private key file to your local machine and create the SSH key resource (a command-line alternative is sketched below)
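
The same key pair can also be generated from the command line with the Azure CLI; a minimal sketch, with placeholder names:

# Generate a new SSH key pair as an Azure resource; the private key is saved locally and the public key is stored in Azure
az sshkey create --name demo-ssh-key --resource-group rg-demo --location eastus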

Store Private key in Key Vault

Store the .pem private key (downloaded as part of the SSH key generation) in Key Vault.

Run the following PowerShell commands to store the private key in Key Vault:

# Read the downloaded .pem file as a single string
$RawSecret = Get-Content "<<Path of downloaded .pem file>>" -Raw
# Convert it to a secure string so it can be stored as a Key Vault secret
$SecureSecret = ConvertTo-SecureString -String $RawSecret -AsPlainText -Force
# Create (or update) the secret in the target Key Vault
$secret = Set-AzKeyVaultSecret -VaultName "<<your vault name>>" -Name "<<name of the secret to be created>>" -SecretValue $SecureSecret

This will create a new secret in Key vault and store the Private key value.
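
When you later need the key, it can be read back from Key Vault with Az PowerShell. A quick sketch, assuming a recent Az.KeyVault version that supports -AsPlainText; names and the output path are placeholders:

# Retrieve the stored private key and write it back to a .pem file
$pem = Get-AzKeyVaultSecret -VaultName "<<your vault name>>" -Name "<<name of the secret>>" -AsPlainText
Set-Content -Path .\vm-key.pem -Value $pem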

Connect Linux VM with SSH Key Pair

  • Go to Linux Machine and Select Bastion 
  • Select SSH Private Key from Azure Key Vault
  • Type VM Username
  • Select the subscription, Azure Key Vault, and Key Vault secret
  • Click Connect

Disable Password based authentication on Linux VMs

  1. SSH into the Linux VM
  2. Type “sudo nano /etc/ssh/sshd_config”
  3. Find “PasswordAuthentication” and change its value from “yes” to “no”
  4. Ctrl + X will quit the editor and you will be asked if you want to save your changes. If you do, press Y for Yes and hit Enter again to write back to the same file.
  5. Restart the sshd service by typing “sudo systemctl restart sshd.service”
  6. Close the SSH session/window
  7. Try logging in with a password again; it should no longer work.

Troubleshoot

  • Unable to Edit /etc/ssh/sshd_config file

Check for immutable and append permissions on file by running: lsattr /etc/ssh/sshd_config

If the immutable attribute is set, you’ll see an “i” in the output.

Remove it by running: chattr -i /etc/ssh/sshd_config

And confirm it by running: lsattr /etc/ssh/sshd_config

  • Unable to login to VM using SSH Private Key from Azure Key Vault

If you’re unable to log in to the VM using the SSH private key from Azure Key Vault, you can reset the SSH public key:

  • Go to the VM,
  • search for Reset Password,
  • And select Reset SSH public key 
  • Type the username, paste the SSH public key, and click Update.

Now you should be able to connect to the Azure Linux VM with the SSH key.

Hope this helps!!

Azure Storage

Azure Storage is a Microsoft-managed cloud service that provides storage that is highly available, secure, durable, scalable and redundant. Whether it is images, audio, video, logs, configuration files, or sensor data from an IoT array, data needs to be stored in a way that can be easily accessible for analysis purposes, and Azure Storage provides options for each one of these possible use cases.

Being a pay-as-you-go model, it gives you the flexibility of paying only for what you use. Azure Storage services support different clients such as .NET, Ruby, Java, etc., which gives developers a choice of language.

Azure Storage allows you to store both structured and unstructured data.

The very first thing you need in order to use storage in Azure is a storage account.

There are five types of storage available within Microsoft Azure Storage:

  • Azure Blob
  • Azure Queues
  • Azure Files
  • Azure Tables
  • Azure Disk

Azure Blob

  • Azure Blob is cloud object storage.
  • It stores unstructured data, in the form of binary large objects (BLOBs).
  • Blob storage can store any type of text or binary data, such as a document, media file, or application installer.
  • Provides streaming of audio and video.
  • Data stored in it is accessible via a REST API and client libraries written in different programming languages (see the sketch below).
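
As an example, here is a quick Az PowerShell sketch of creating a storage account and a blob container and uploading a file; all names are placeholders:

# Create a storage account and a blob container, then upload a local file as a blob
$rg = "rg-storage-demo"
$sa = New-AzStorageAccount -ResourceGroupName $rg -Name "stdemo$(Get-Random)" -Location "eastus" -SkuName Standard_LRS -Kind StorageV2
$ctx = $sa.Context
New-AzStorageContainer -Name "images" -Context $ctx
Set-AzStorageBlobContent -File ".\photo.jpg" -Container "images" -Blob "photo.jpg" -Context $ctx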

Azure Queues

  • Azure Queues is a messaging service.
  • It stores messages in a buffer for a configurable time period (the default time-to-live for a message is 7 days).
  • The maximum size of a single message is 64 KB.
  • Data is accessible via a REST API.

Azure Files

  • It is a managed file share in the cloud.
  • It stores unstructured data.
  • It uses the SMB (Server Message Block) protocol or the NFS (Network File System) protocol.
  • It can be used in the cloud or on-premises.
  • It can be used to replace on-premises file shares.
  • Data is accessible via a REST API (see the sketch below).
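
A short Az PowerShell sketch of creating a file share and uploading a file to it, reusing a storage context like the one in the blob example above; names are placeholders:

# Create an SMB file share in an existing storage account and upload a file to it
$sa  = Get-AzStorageAccount -ResourceGroupName "rg-storage-demo" -Name "<your-storage-account>"
$ctx = $sa.Context
New-AzStorageShare -Name "teamshare" -Context $ctx
Set-AzStorageFileContent -ShareName "teamshare" -Source ".\report.docx" -Path "report.docx" -Context $ctx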

Azure Tables

  • It is a NoSQL (key-value pair) database.
  • It stores structured, non-relational data.
  • Data is accessible via a REST API and client libraries written in different programming languages.
  • It offers fast access to data compared to traditional relational databases.

Azure Disk

  • It is block-level cloud storage (an emulation of a physical hard drive).
  • It comes in three options: Premium SSD (solid-state drive), Standard SSD (solid-state drive), and Standard HDD (hard disk drive).
  • It facilitates moving on-premises VMs to Azure.
  • It integrates with availability sets and availability zones.

CI/CD for React Applications using Azure Devops

The main focus of this article is CI/CD for a React application: first creating a build pipeline, then creating a release pipeline, creating an App Service on the Azure portal, and finally deploying the React application to that App Service.

The first step is to have a React application in an Azure DevOps repository.

Locally, we usually run a series of steps such as:

  1. npm install — to install node_modules locally
  2. webpack — to create the bundled files

Let’s look at my package.json, where I have set up an npm script to run the webpack command.

The npm run build command will execute webpack and generate the bundled files in the dist folder. The dist folder will contain two files: the bundled JS file and an index.html.
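
For reference, the local equivalent is just two commands, assuming package.json defines a build script that invokes webpack (the exact script contents are whatever you configured):

# Install dependencies into node_modules
npm install
# Run the "build" script from package.json, which invokes webpack and emits the bundle into dist/
npm run build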

Now let’s create the build pipeline by going to the Pipelines section on the left and selecting the “Node.js with gulp” template. It comes with predefined tasks, which we can change as per our requirements.

First, select the npm task, where we will run the npm install command.

Next, we will remove the “Run gulp” task and add another npm task from the task list on the right side.

Click the Add button on the npm task and it will appear below the first task.

Change the display name from “npm install” to “npm run build” and change the command from install to custom. This displays the command and arguments input box; set it to “run build”, the npm script defined in package.json.

Now move to the next step, archiving files. Here we tell the task which folder to archive.

Change the root folder by clicking the three dots on the right side and select the path to the dist folder, as that is where the index_bundle.js file is created and index.html is copied by the previous task.

Now move to the publish artifacts task, select the path from which to pick the files, and define the artifact name that will be used in the release pipeline.

Now save and queue the build and see whether it succeeds or fails.

We can run this pipeline manually by clicking Run, or we can set it up to trigger automatically whenever code is pushed to a specific branch.

Once you receive the success notification for the build pipeline, we are done with the continuous integration part of DevOps.

Now let’s move on to the continuous deployment part.

Click on the New Release Pipeline option.

After clicking it, we are presented with a list of options for the type of release pipeline we want to create. Select the Azure App Service deployment option. Now we can edit the stage name.

After editing the stage name, click on the “1 job, 1 task” option to configure the deployment task.

Here we need to supply the Azure subscription and App Service name.

Click on the Manage link. It will show a list of service connections, if any have already been created, and a link to create a new service connection.


Here is the post on How to Create Service Connection.

Now add the details for the Azure subscription and App Service name.

Now click on Save, go back to the release pipeline, and click “Add an artifact”.

There we will select the build pipeline we created previously as the source, to use its artifact.

Now click on Add to add this artifact to the release pipeline.

We are done with creating our release pipeline. Now we can create a new release and trigger it to deploy the React app to the Azure App Service URL.

Now we are done with creating the release pipeline with 1 job and 1 artifact.

You can run this release by clicking on the Deploy button, and it will push the React application to the Azure App Service.

Once the release is successfully deployed, we can see the React app running on the App Service.

GitHub Branching Strategy

Context

I will use the Gitflow branching model as a starting point. This is a common strategy used by enterprises, but it should be reviewed in the future to find a flexible and collaborative way of sharing code consistently, depending on the team’s size and requirements.

How it works:

This workflow mainly uses two branches (“master” & “develop”) to record the history of the project.

  • master: is considered to be the main branch, where the source code of HEAD always reflects a production-ready state. It’s also convenient to tag all commits in the master branch with a version number.
  • develop: is considered to be the main branch where the source code of HEAD always reflects a state with the latest delivered development changes for the next release. This is also called the “integration branch”.


When the source code in the develop branch reaches a stable point and is ready to be released, all of the changes should be merged back into master and tagged with a release number.
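
A sketch of what that merge-and-tag step looks like on the command line; the version number and remote name are examples:

# Merge the stabilised develop branch into master and tag the release
git checkout master
git merge --no-ff develop
git tag -a v1.2.0 -m "Release 1.2.0"
git push origin master --tags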

Supporting Branches

Apart from the main branches master & develop, the following branches can be used to aid parallel development between team members based on the requirements.

  • feature
  • release
  • hotfix

Feature Branch

This branch is used to develop new features for the upcoming or a distant future release. The essence of a feature branch is that it exists as long as the feature is in development, but it will eventually be merged back into the develop branch.
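
A typical feature-branch lifecycle, sketched with plain git commands (the branch name is an example):

# Branch off develop, do the work, then merge it back and clean up
git checkout develop
git checkout -b feature/login-page
# ... commit changes on the feature branch ...
git checkout develop
git merge --no-ff feature/login-page
git branch -d feature/login-page
git push origin develop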


Release Branch

This branch is used to support preparation of a new production release.


Hotfix Branch

This branch is used to patch production releases. Hotfix branches are a lot like release and feature branches except they’re based on master instead of develop.


Basic Git Commands

     Description                                             Command
  1  Clone a repo located at <repo_url>                      git clone <repo_url>
  2  Fetch/pull the changes from remote                      git pull
  3  Stage all changes in <path> for the next commit         git add <path>
  4  Commit the staged snapshot                              git commit -m "<message>"
  5  List which files are staged, unstaged, and untracked    git status
  6  Show the differences between your index and working directory   git diff
  7  Upload/push local repo changes to remote                git push
  8  Create a new branch                                     git checkout -b <branch name>
  9  Switch to an existing branch                            git checkout <branch name>
  10 List branches                                           git branch (local branches), git branch -r (remote branches)
  11 Create a tag                                            git tag <tag>
  12 List tags                                               git tag
  13 Push tags                                               git push --tags

Reference:

https://docs.microsoft.com/en-us/azure/devops/learn/devops-at-microsoft/release-flow

Service Connections on Azure DevOps

You need to connect to external or remote services to execute tasks in a job. For example, you need to connect to your Microsoft Azure subscription through Azure DevOps to deploy resources to that subscription.

You can define service connections in Azure Pipelines that are available for use in all your tasks. For example, you can create a service connection for your Azure subscription and use this service connection name in an Azure Web Site Deployment task in a release pipeline.

Once you create a new release pipeline in Azure DevOps (detailed in my previous post), you will be asked to select the target Azure subscription containing the Azure App Service resources. The drop-down lists all the Azure subscriptions that you can access. When you select the desired subscription, you are asked to authorize it in order to configure an Azure service connection.


When you hit the Authorize button, you may get an authorization error.


This can be fixed by manually creating the Azure service connection using an Azure App Registration.

Create the Azure App Registration

Starting with the Azure Portal account, let’s create an app registration to represent our connection to Azure AD.

In Azure portal within the Azure Active Directory go to the App registrations tab and create a new registration. Provide a display name (e.g. Azure DevOps Connection) and Register the app.

Create a new client secret a.k.a. application password that we can use later in the setup of the Azure service connection in Azure DevOps.


Then we need to assign that registration to the subscription containing the resources we want via Access Control.

Provide access to the Azure App Registration

Navigate to the resource group containing the Azure App Services that will be used for the deployment.

Capture

In the Access control (IAM) panel, add a new role assignment (for example, Contributor) for the app registration.

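If you prefer the command line, the app registration, service principal, client secret, and role assignment can all be created in one step with the Azure CLI. A sketch, with a placeholder name and scope:

# Creates an app registration plus service principal, assigns it Contributor on the resource group, and prints the client secret
az ad sp create-for-rbac --name "AzureDevOpsConnection" --role Contributor --scopes /subscriptions/<subscription-id>/resourceGroups/<resource-group>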

We are now ready to manually create the Azure service connection.

Creating Service Connection in Azure DevOps

When an application needs access to deploy or configure resources through ARM or VSTS in Azure, you’ll need to create a service principal, which is a credential for your application.

Now switch to your Azure DevOps organization and create a new Azure Resource Manager service connection using the service principal details.


Provide the details for your connection name, subscription ID, subscription name, service principal ID, service principal key (the client secret created earlier), and tenant ID.


Finally verify and save the service connection and you are good to go!

And that’s done! Now your pipelines can access the Azure resources.

Happy deploying!

CI/CD with Azure Devops for ARM Templates

Overview

In my previous post, we learned how to create an ARM template and deploy it using Visual Studio. In this blog post, I’ll explain how to use Azure CI/CD pipelines to provide an end-to-end automation experience when deploying an ARM template via Azure DevOps.

Azure Pipelines, a core part of Azure DevOps, allows the creation of CI (continuous integration) pipelines in a declarative way using YAML documents; these are also called build pipelines. The same capability extends to CD (continuous delivery) pipelines, also known as release pipelines. It is possible to define multi-stage pipelines-as-code for both continuous integration and continuous delivery in the same YAML definition file.

Since GitHub can be easily integrated with Azure DevOps, you can not only build your CI/CD pipeline from your source code on GitHub, but also map your GitHub repository permissions to Azure DevOps.

Solution Overview

In order to use Azure DevOps build, your Azure DevOps project must contain the source code for the application. For this lab, we are using an Azure DevOps project with a Git repository, which I already explained in my previous post.

Prerequisites

Step 1: Get an Azure subscription

You need an Azure account. Create one by browsing to https://azure.microsoft.com/en-us/free/

Step 2: Get an Azure DevOps account and create organisation and project. You can check Quickstart: Create an organization or project collection to know more details.

Step 3: Get source from Github repository.

Define service connections

Once you have all the prerequisites, it’s time to do some DevOps magic! You will typically need to connect to Azure or other external services to execute tasks in a job.

There is a separate post on How to create Service Connections on Azure DevOps.

Set up the build pipeline

The build is used to package (or actually build) the application and create an artifact from it.

In Azure DevOps, my folder structure looks like:


Now, go to Pipelines and create a new pipeline. Click on “Use the classic editor”. It will show the options to select your repo. By default, my Az-ResourceGroup Creation repo is already selected from Azure Repos Git. You can change it to any other source control as well, whatever you’re using for your project.


From the new page shown, select the “empty process” link so that there are no tasks pre-configured for us.


Once your empty build pipeline is ready, you can add tasks in the build.


Task in the Build

Now we first need to add a task so that we can generate the artifacts. Click “Add tasks” in the left pane and then search for the task.

ARM template validation

Choose Azure Resource Group Deployment. Click Add.

This task can be used to deploy to Azure, so let’s set it up.

Display name and Azure subscription
Give the deployment a name and connect it to your Azure subscription. It might ask you to authenticate.

Action
Use default action: Create or update resource group.

Resource group and location
For the resource group, you can use the drop-down to pick an existing resource group, or type a new name, in which case it will be created. Set the location to your preference. I’m using variables for all those values, with the values assigned under the Library section.


Template, Template Parameters, Deployment mode
Next to the Template area, you can click the three dots to select your template in the repository. Do the same for the parameter file. Set the Deployment mode to “Validation only”. This way nothing will be deployed; only validation will run (it will create the resource group if it doesn’t exist, though).
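
The same validation that this task performs can also be run locally with Az PowerShell before you commit. A small sketch; the resource group and file names are placeholders:

# Validate the template and parameters against a resource group without deploying anything
Test-AzResourceGroupDeployment -ResourceGroupName "rg-arm-demo" -TemplateFile .\WebSite.json -TemplateParameterFile .\WebSite.parameters.json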


You can now save this step and move on to the next step.

Publish artifacts

Now we need to publish the artifacts so that the files are available to the release pipeline.

Add the step Publish Build Artifact to achieve this.

Change the “Path to publish” to make only the deployment files available. Click the three dots and select the folder.


The build pipeline is now complete.

That’s it. You can now use Save & queue to run the build immediately, or just Save, which makes the build available for future commits. We now have a build definition ready for CI purposes.

Set up the Release pipeline

After the build, it’s time to set up the release to deploy directly to Azure. The general guideline is to create a release definition, link it to the build, and then create a CD pipeline.

Go to your completed build and click Succeeded in the top right. When you do, a button will become available saying “Set up release”.

Click the button and a new window will pop up. You can select a template, similar to the build steps. Again, choose the empty job. Change the stage name to “Deploy to Azure” and close the stage window with the X in the top right corner.

Click on the Name of the release pipeline at the top of the screen to make it more descriptive.


Artifact

Click Add an artifact. The screen will default to Build as the source type. In the drop-down, select the build we have just set up. Specify for the release to use the latest build. Keep the other settings as they are.


Tasks

Now click “1 job, 0 tasks” on the “Deploy to Azure” stage. Here you can add tasks like you did for the build. Create a new Azure Resource Group Deployment task.

At this step, you can repeat the settings for the subscription and resource group that you used in the Build.

When you set up the template and parameters, though, you need to point them to the files that were created by the build. For this, you can again use the three dots to navigate to them.

The most important difference here is the deployment mode: you set it to either Incremental or Complete. Which one you choose depends on your needs. I leave it at Incremental, as I don’t want to overwrite existing resources.
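
The distinction maps directly to the -Mode parameter of the underlying deployment cmdlet; a quick sketch with placeholder names:

# Incremental: resources in the resource group that are not in the template are left untouched
New-AzResourceGroupDeployment -ResourceGroupName "rg-arm-demo" -TemplateFile .\WebSite.json -TemplateParameterFile .\WebSite.parameters.json -Mode Incremental

# Complete: resources in the resource group that are not in the template are deleted
New-AzResourceGroupDeployment -ResourceGroupName "rg-arm-demo" -TemplateFile .\WebSite.json -TemplateParameterFile .\WebSite.parameters.json -Mode Complete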


Trigger the Release

Once you have added the steps, you can trigger the release, which will deploy all the components to Azure.


Once it completes, we can see the resources deployed in the Azure portal.

That’s it.

Infrastructure as Code- Azure ARM Templates

One of the buzzwords in the DevOps world is “Infrastructure as Code”, and Azure ARM templates are a good example of what it looks like.

Azure Resource Manager (ARM) allows users to deploy applications to Azure using declarative JSON templates. The template will be used to deploy your components to different environments such as development, testing, staging, and production.

This article will help to get the answers of following:

  • Do you have a complex Azure infrastructure?
  • Do you need to version your infrastructure and keep a history for tracking?
  • Do you need to stop worrying about the deployment of your infrastructure and make it automated?

The answer to all of the above questions is ARM templates, GitHub (or any version control system), and Azure DevOps.

Using an ARM template allows you to specify all of the host infrastructure details in code, which makes creating infrastructure much easier, more repeatable, and more stable.

Create an ARM template

There are different ways to create the ARM templates.

If you are unfamiliar with JSON and ARM templates, I recommend reading Microsoft’s official documentation and getting-started guide, which show how you can export the template from an existing deployment or create a new one.

For this demo, I will use Visual Studio 2019 to create the ARM template.

If you don’t have Visual Studio installed already, you can download and install it from here. While installing Visual Studio, make sure you select the Azure development workload.

If you already have Visual Studio and did not select this workload, you can add it by going to Windows “Add/Remove programs”, modifying your Visual Studio installation, and adding it there.

Before starting with Visual Studio 2019, make sure you have the correct package installed for Azure development; the “Azure development” workload is the one I needed to install via the Visual Studio Installer.


Now open Visual Studio, select File > New Project, and under the C# branch select “Cloud” and then the “Azure Resource Group” project template.


This template creates an Azure Resource Group deployment project. The deployment project will contain artifacts needed to provision Azure resources using Azure Resource Manager that will create an environment for your application.

Next, Visual Studio 2019 gives the option to select what kind of ARM template I want to use. There are a bunch of useful templates available, and to create an Azure App Service I’ve selected the “Web app” template. You can start blank or select from the numerous templates available from GitHub.


After clicking OK on that screen, the ARM project is created.

It creates three files in the solution:


  1. A PowerShell script that can be run locally or from Azure DevOps to deploy the resource group based on the template.
  2. An ARM template that can deploy an Azure web app to Azure.
  3. An ARM template parameters file that is used to define any variables used in the main website ARM template.

Deploy an ARM template

An ARM template can be deployed in two ways:

  1. Using Visual Studio deploy.

       It is quite simple. All we need to do is:

  • Open the WebSite.parameters.json file
  • Update all the values that are missing and save it.
  • Right-click on the project and click Deploy.
  • Select your subscription and done! (An equivalent PowerShell deployment is sketched below.)
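
Equivalently, the template in the project can be deployed from a PowerShell prompt without Visual Studio. A minimal sketch; the resource group, location, and file names are placeholders:

# Create the target resource group (if it doesn't exist) and deploy the template into it
New-AzResourceGroup -Name "rg-webapp-demo" -Location "eastus" -Force
New-AzResourceGroupDeployment -ResourceGroupName "rg-webapp-demo" -TemplateFile .\WebSite.json -TemplateParameterFile .\WebSite.parameters.json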

Move/Push the code to GitHub.

2. Using a build/deploy tool. I’ll be covering this in my next post. Here is the post.

Happy Coding!!

 

Azure : can’t access KeyVault

Context: as a tenant administrator you can’t inspect KeyVault contents created by other people (“you are unauthorized to view these contents”).

Solution: assign yourself the appropriate permissions as follows (a PowerShell equivalent is sketched after the steps):

Locate the KeyVault in the Azure portal:

  1. go to “Access policies”
  2. click “+Add Access Policy”
  3. Key permissions: everything under “Key Management Operations” and “Cryptographic Operations”
  4. Do the same for secret and certificate permissions
  5. Select yourself as principal
  6. Leave the Authorized application empty
  7. Add
  8. Save
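
The same assignment can be scripted with Az PowerShell. A sketch granting one user broad key, secret, and certificate permissions; the vault name and user principal name are placeholders, and this applies to vaults that use the classic access-policy model rather than Azure RBAC:

# Grant a user full data-plane permissions on the vault (access policy permission model)
Set-AzKeyVaultAccessPolicy -VaultName "<vault-name>" -UserPrincipalName "you@yourtenant.com" `
  -PermissionsToKeys get,list,update,create,import,delete,recover,backup,restore,decrypt,encrypt,unwrapKey,wrapKey,verify,sign `
  -PermissionsToSecrets get,list,set,delete,recover,backup,restore `
  -PermissionsToCertificates get,list,update,create,import,delete,recover,backup,restore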

Getting started with Terraform

I’m becoming a fan of Terraform, having started using it at work to manage our Azure environments and, more recently, with Azure DevOps. Terraform is an infrastructure-as-code tool that allows you to create and manage cloud-based infrastructure in a codified way. It is an open-source tool that codifies APIs into declarative configuration files (*.tf) that can be source-controlled to version your changes.

Infrastructure as Code (IaC) is the practice of provisioning and managing computing infrastructure with a declarative approach, so that the changes you want to make to your infrastructure can be versioned in source control. Infrastructure as code is a modern approach to managing infrastructure and is often called the “foundation for DevOps”.

Advantages

Using Infrastructure as Code as part of your deployment process has a number of benefits to your workflow:

  • Consistency – the IaC will be consistent through source code management, which is helpful when it comes to rolling back if mistakes are made.
  • Speed – creating infrastructure through a declarative approach is faster than manually navigating through the interface.
  • Reusability – existing code can be reused for provisioning and deployment across multiple platforms.
  • Extensibility – existing code can be extended for provisioning and deployment.

Terraform uses the concept of “providers”, which are plugins that extend the capabilities of the tool so it can interact with various cloud systems in a number of different ways.

Installing Terraform

I’m using Windows 64-bit.

Terraform is packaged as a zip archive. You can download it from the Terraform website, or install it using Chocolatey, which works well too.

  • After downloading Terraform, unzip the package.

  • Copy the files from the zip to, for example, “C:\terraform”. That’s my Terraform path. PATH is the system variable that the operating system uses to locate needed executables from the command line or terminal window.

For Windows 8 

  • Search for : System (Control panel)

  • Click the Advanced system settings link.

  • Click Environment Variables.

  • In the section System Variables, find the PATH environment variable and select it. Click Edit. If the PATH environment variable does not exist, click New.

  • In the Edit System Variable window, append the Terraform path, e.g. “C:\terraform;”, to the end of the PATH environment variable.

  • Click OK, then OK, then OK

Verify Install

Once you have configured the terraform, you should be able to run terraform commands. Open up command prompt and type terraform. You should see the terraform usage appear.
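
In a PowerShell window the whole check looks roughly like this, assuming you unzipped Terraform to C:\terraform (the session-only PATH tweak is optional if you already updated the system PATH above):

# Make terraform.exe resolvable in the current session, then confirm the version
$env:Path += ";C:\terraform"
terraform -version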

Terraform Extension with Visual Studio Code

  • Once the installation is complete, Visual Studio Code can be opened from the Start Menu.
  • To install extensions, press F1 and start typing “ext”, or press Ctrl + Shift + X.

  • Choose Extensions: Install Extensions
  • Install Terraform and Azure Terraform extensions

Now you’re ready to write some code!
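
Once a configuration (one or more *.tf files) exists in a folder, the day-to-day workflow boils down to three commands, run from that folder; a quick sketch:

# Download the providers referenced in the configuration
terraform init
# Preview what would change
terraform plan
# Apply the changes (prompts for confirmation)
terraform apply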

Happy Coding!