Archive for June, 2018

Azure Key Vault

Posted: June 13, 2018 in Azure

Azure Key Vault is a secure secrets store in which we can keep passwords, connection strings, and other pieces of information our applications need. We want this information to be available, but also secured. Key Vault allows you to create multiple secure containers, called vaults, which are backed by hardware security modules (HSMs).

To create a Key Vault, click Create a resource, type "Key Vault" and click Create

 


Give it a name, specify a resource group and location, and click Create

 


Once the vault is created, click Secrets to add a new secret

 


In this example I stored storage account keys in the vault. First, copy the storage account keys


Then paste the key into the vault. Optionally, activation and expiration dates can be specified.


Now we need to point our application to this Key Vault. I'm not a developer, so I created a fake (web) application for demonstration purposes.

Azure Active Directory-App registration-New application registration

 


Give the application a name and specify a URL

 


Once the application is created, go to its properties and click Keys

 


Create a key and specify an expiration period

 


Copy the key to the clipboard; you will use it in your code to connect to Key Vault

 


Now we need to create a Key Vault access policy: go to the resource group, locate the Key Vault and click Access policies


Add new

 


Click Select principal and choose the application we created earlier

 


Select the actions the application can perform against the vault; in this case it can only get secrets

 


The web application can now get the storage key from Key Vault.
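To sketch the application side (this code is not part of the original post; the vault name, secret name, tenant ID and credentials are illustrative placeholders), the registered app can trade its key for an AAD token and then read the secret through Key Vault's REST interface:

```python
import requests  # third-party HTTP library, used elsewhere in these posts

TENANT_ID = "your-tenant-id"       # placeholder: AAD tenant of the app registration
CLIENT_ID = "your-application-id"  # placeholder: the registered app's Application ID
CLIENT_SECRET = "your-app-key"     # placeholder: the key copied from the portal


def secret_url(vault_name, secret_name):
    # Key Vault data-plane URL for reading the latest version of a secret.
    return ("https://%s.vault.azure.net/secrets/%s?api-version=7.0"
            % (vault_name, secret_name))


def get_access_token():
    # Client-credentials grant against AAD, scoped to the Key Vault resource.
    resp = requests.post(
        "https://login.microsoftonline.com/%s/oauth2/token" % TENANT_ID,
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "resource": "https://vault.azure.net",
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def get_secret(vault_name, secret_name):
    # The access policy above only granted Get on secrets, which is all this needs.
    resp = requests.get(
        secret_url(vault_name, secret_name),
        headers={"Authorization": "Bearer " + get_access_token()},
    )
    resp.raise_for_status()
    return resp.json()["value"]
```

Calling `get_secret("myvault", "StorageKey")` would then return the stored key, provided the access policy grants the app Get on secrets.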


Installing SQL database in Azure

Posted: June 10, 2018 in Azure

From the Azure portal click SQL Databases-New SQL Database, click Server-Create a new server, and enter a server name and credentials

 


Select Pricing Tier


Select tier

 


 

After the database is created, go to Connection strings to see the string for connecting to the database

 

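The string from that blade can also be assembled in code. A minimal sketch (the server, database, credentials and ODBC driver name below are assumptions; use whatever your portal actually shows and whichever driver is installed locally):

```python
def azure_sql_conn_str(server, database, user, password):
    # Mirrors the ODBC-style string shown on the portal's "Connection strings"
    # blade; the driver name is an assumption - match your installed driver.
    return (
        f"Driver={{ODBC Driver 17 for SQL Server}};"
        f"Server=tcp:{server}.database.windows.net,1433;"
        f"Database={database};Uid={user};Pwd={password};"
        "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
    )
```

With pyodbc installed, you would pass the result to `pyodbc.connect(...)` to open a connection.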

Connecting to the database

In the database settings click Query editor (preview)-Login

 


Enter credentials

 


Enter a T-SQL query and click Run

 


Firewall settings

On the SQL database click Overview-Set server firewall


Add client IP creates a rule allowing connections to the SQL database from your local machine


Configuring Geo-Replication

Active geo-replication enables configuring up to four readable secondary databases in the same or different data center locations (regions). Secondary databases are available for querying, and for failover if there is a data center outage or the primary database cannot be reached. In the database settings click Geo-Replication; the current database region is marked with a blue check mark. Click any green circle where you want to create a read-only copy of the database.


 

The next window will appear


 

SQL Server security settings

In the SQL server settings click Advanced Threat Protection-Enable Advanced Threat Protection, click On, and save


Click Storage details to configure retention

 


Click Threat Detection types to select what you want to audit

 


Masking data

If you want to hide some database columns, in the database settings click Dynamic Data Masking


Creating Elastic Pool

An elastic pool is a technique where you place multiple Azure databases into a pool in which they all share resources. The pool is configured to offer a maximum and minimum amount of computing resources. Microsoft has defined these available resources as a Database Throughput Unit, or DTU; in the case of elastic pools, they are defined as an elastic Database Throughput Unit, or eDTU. DTUs and eDTUs are calculated essentially the same way, through a calculation of measured disk reads, disk writes, processor time and transaction log flushes.

Again, click on the SQL database

 


From the Overview blade click New pool

 


Adding a database to the pool

 

Once the pool is created, click Configure


Click Database

 


Select the database


Azure Managed Service Identity enables application code to connect to a storage account without providing credentials. More info here

Click on the Azure VM-Configuration-Managed service identity, click Yes, and save

 


As a result, a new extension will be created

 


Enabling Azure VM access to the storage account

 

Click on storage account-Access control (IAM)


Role: Storage Account Key Operator Service Role

Assign access to: Virtual Machine

Subscription: your subscription

Resource group: select the resource group and select the VM

 

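As an illustrative sketch of what the VM's code can then do (not from the original post; the subscription, resource group, account names and API versions are assumptions), the application asks the local instance metadata endpoint for a token and calls the ARM listKeys operation this role permits:

```python
import requests  # third-party HTTP library, used elsewhere in these posts

# The instance metadata service is only reachable from inside the Azure VM.
IMDS_TOKEN_URL = "http://169.254.169.254/metadata/identity/oauth2/token"


def get_msi_token(resource="https://management.azure.com/"):
    # IMDS hands out tokens for the VM's managed identity; no credentials
    # are stored in the application itself.
    resp = requests.get(
        IMDS_TOKEN_URL,
        params={"api-version": "2018-02-01", "resource": resource},
        headers={"Metadata": "true"},
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def list_keys_url(subscription_id, resource_group, account):
    # ARM operation that the "Storage Account Key Operator Service Role" allows.
    return ("https://management.azure.com/subscriptions/%s/resourceGroups/%s"
            "/providers/Microsoft.Storage/storageAccounts/%s/listKeys"
            "?api-version=2017-10-01"
            % (subscription_id, resource_group, account))


def get_storage_keys(subscription_id, resource_group, account):
    resp = requests.post(
        list_keys_url(subscription_id, resource_group, account),
        headers={"Authorization": "Bearer " + get_msi_token()},
    )
    resp.raise_for_status()
    return resp.json()["keys"]
```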

 

In one of the previous posts we created JIRA subtasks using the REST API; in this example we'll see how to create a new JIRA task with an Epic link, label, assignee and reporter.

Bash:

curl -D- -u user:pass -X POST --data "{\"fields\":{\"labels\":[\"SERVICES\"],\"assignee\":{\"name\":\"emergencyadmin\"},\"reporter\":{\"name\":\"user\"},\"project\":{\"key\":\"AA\"},\"summary\":\"Create user account in Local AD.\",\"description\":\"Create user account in Local AD.\",\"issuetype\":{\"name\":\"Managed Service\"},\"customfield_10107\":{\"id\":\"10505\"},\"customfield_10006\":\"CP-3289\"}}" -H "Content-Type:application/json" https://jira.company.com/rest/api/latest/issue/

customfield_10006 - Epic link
customfield_10107 - client account

Python:

 

#!/usr/bin/python3

import json
import argparse
import requests

parser = argparse.ArgumentParser()
parser.add_argument('-pass', '-password', dest='password', help='svc-rundeck password.')
args = parser.parse_args()


username = 'svc'
password = args.password



def create_subtask(summary, description, key, line, user, password):
    headers = {"Content-Type": "application/json"}
    data = {"fields": {"project": {"key": key}, "parent": {"key": line},
                       "summary": summary, "description": description,
                       "issuetype": {"name": "Sub-task"},
                       "customfield_10107": {"id": "10400"}}}
    response = requests.post("https://jira.corp.company.com/rest/api/latest/issue/",
                             headers=headers, data=json.dumps(data),
                             auth=(user, password))
    out = response.json()
    print(out)

headers = {
    'Content-Type': 'application/json',
}

params = (
    ('jql', 'project="Technology" AND summary~"New User*" AND issuetype="Task" AND status!="DONE"'),
)

response = requests.get('https://jira.corp.company.com/rest/api/2/search', headers=headers, params=params, auth=(username, password))

data = response.json()

for issue in data['issues']:
    if len(issue['fields']['subtasks']) == 0:
        line = issue['key']
        create_subtask("Create user account in Local AD.",
                       "Create user account in Local AD.",
                       "TECH", line, username, password)

In the previous post we created AD synchronization with Office 365; in this one we'll see how we can manage it.

Using the Synchronization Service we can add organizational units to, or exclude them from, AD synchronization


Click Connection-properties


Containers


Enter domain admin credentials


Choose what will be synchronized


To start synchronization manually, select the connection, right-click, and choose Run


To monitor synchronization, open the Synchronization Rules Editor


A similar tool in the Office 365 portal is DirSync manager


PowerShell commands

Import-Module "C:\Program Files\Microsoft Azure AD Sync\Bin\ADSync\ADSync"

Get synchronization schedule:

Get-ADSyncScheduler

PS C:\Users\Administrator> Get-ADSyncScheduler

AllowedSyncCycleInterval : 00:30:00
CurrentlyEffectiveSyncCycleInterval : 00:30:00
CustomizedSyncCycleInterval :
NextSyncCyclePolicyType : Delta
NextSyncCycleStartTimeInUTC : 5/30/2018 8:04:05 PM
PurgeRunHistoryInterval : 7.00:00:00
SyncCycleEnabled : True
MaintenanceEnabled : True
StagingModeEnabled : False
SchedulerSuspended : False
SyncCycleInProgress : False

Start synchronization manually:

initial: full synchronization
delta: synchronize changes since the last full synchronization

Start-ADSyncSyncCycle -PolicyType Initial

PS C:\Users\Administrator> Start-ADSyncSyncCycle

Result
------
Success

Enable scheduled synchronization

Set-ADSyncScheduler -SyncCycleEnabled $true

 

Azure User Defined route

Posted: June 5, 2018 in Azure

Azure automatically creates system routes and assigns them to each subnet in a virtual network. We can't create or remove system routes, but we can override some of them with a User Defined Route.

In this example I created 3 subnets

 


 

I created 3 VMs and assigned them to the subnets above


 

Traffic from the appservers subnet to the webserver subnet (and vice versa) will go through the firewall machine rather than directly. We'll achieve this using a User Defined Route.

From the Azure portal click Create a resource-Route Table


 

Give it a name and resource group and click Create, then click Routes-Add


In this example I set a route to 10.0.1.0/24 through 10.0.4.4 (the firewall VM)

 


Now go to the network interface of the firewall machine (Networking-Network interface)


IP Configuration-IP forwarding settings-Enabled

 


On the same machine (the firewall), install Routing and Remote Access

 


From the web server, try reaching the app server; the traffic goes through the firewall

 


 

Login-AzureRmAccount
Set-AzureRmContext -SubscriptionId "subscriptionid"

$accountKeys = Get-AzureRmStorageAccountKey -ResourceGroupName "test" -Name "mystorageaccount201806"

#The Storage Context object itself is what enables us to authenticate to the Azure Storage REST API from PowerShell.

$storageContext = New-AzureStorageContext -StorageAccountName "mystorageaccount201806" -StorageAccountKey $accountKeys[0].Value

#SAS key expiry time

$expiryTime = (get-date).AddYears(1)

#set permissions: r - read, w - write, l - list

$permission = "rwl"

#Create policy

New-AzureStorageContainerStoredAccessPolicy -Context $storageContext -Container "test" -Policy "test" -ExpiryTime $expiryTime -Permission $permission

#Get token

$sasToken = New-AzureStorageContainerSASToken -Name "test" -Policy "test" -Context $storageContext
$sasToken = $sasToken.substring(1)

#Write-Host "SAS token (ref shared access policy): $sasToken"

$sasToken2 = New-AzureStorageContainerSASToken -Context $storageContext -Container tibp-userprofiles -Permission rwl
#Write-Host 'SAS token: ' $($sasToken2)
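Once generated, the SAS token is simply appended to the container URL as a query string. A hedged Python sketch of listing blobs with it (the account and container names are the placeholders used above; any client that can make HTTP requests would work):

```python
import requests  # third-party HTTP library, used elsewhere in these posts


def container_list_url(account, container, sas_token):
    # Blob service "List Blobs" operation, authorized purely by the SAS
    # query string - the caller never sees the account key itself.
    return ("https://%s.blob.core.windows.net/%s"
            "?restype=container&comp=list&%s"
            % (account, container, sas_token))


def list_blobs_xml(account, container, sas_token):
    resp = requests.get(container_list_url(account, container, sas_token))
    resp.raise_for_status()
    return resp.text  # XML listing of the container's blobs
```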