Archive for the ‘Scripts’ Category

On the SQL Server, create a stored procedure for collecting job status:

CREATE PROCEDURE [dbo].[usp_job_server_error] @job_name VARCHAR(100) AS
BEGIN
SET NOCOUNT ON
SELECT j.name AS '[JOB]',
CASE
WHEN jh.run_status IN ( 0, 1, 2, 3, 4 ) THEN jh.run_status
ELSE ( CASE
WHEN ja.run_requested_date IS NOT NULL
AND ja.stop_execution_date IS NULL THEN 4
ELSE -1
END ) END AS '[STATUS]',
ja.run_requested_date AS '[LAST_EXECUTION]'
-- assumed joins against the msdb job tables (the FROM clause was missing in the original post)
FROM msdb.dbo.sysjobactivity ja
INNER JOIN msdb.dbo.sysjobs j ON ja.job_id = j.job_id
LEFT JOIN msdb.dbo.sysjobhistory jh ON ja.job_history_id = jh.instance_id
WHERE ja.session_id = (SELECT Max(session_id) FROM msdb.dbo.sysjobactivity)
AND j.enabled = 1
AND j.name = ISNULL(@job_name, j.name)
--AND (j.name LIKE ISNULL(@identifier, 'HIGH')+'%' OR j.name LIKE ISNULL(@identifier, 'DISASTER')+'%')
SET NOCOUNT OFF
END;
GO

PowerShell script for getting the data from the stored procedure and sending it to Zabbix:

 

$data = $(foreach ($line in sqlcmd -Q "exec dbo.usp_job_server_error @job_name='sql job'" -E -s ":") { $line.Split(":")[1] })
# skip the sqlcmd header and separator rows
$result = $data[2..$data.Length]
cd "C:\Program Files\Zabbix Agent\bin\win64"
.\zabbix_sender.exe -z zabbix_server -p 10051 -s zabbix_host -c "C:\Program Files\Zabbix Agent\conf\zabbix_agentd.win.conf" -k sql.job[myjob] -o $result -vv
Create a Zabbix item (a trapper item with the key sql.job[myjob], matching the key used by zabbix_sender above):
[screenshot: Zabbix item configuration]
And a trigger that fires when the last received run status is 0 (job failed):
{server:sql.job[myjob].last()}=0
Schedule the above script with Task Scheduler:

Program/script: C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe

Add argument: -file "C:\File\Scripts\jobstatus.ps1"

Start In: C:\File\Scripts
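If you prefer to create the task from PowerShell instead of the GUI, a minimal sketch could look like this (the task name and the 5-minute repetition interval are assumptions, adjust to your environment):

# Sketch: register jobstatus.ps1 as a scheduled task (task name and interval are assumptions)
$action = New-ScheduledTaskAction -Execute "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe" -Argument '-File "C:\File\Scripts\jobstatus.ps1"' -WorkingDirectory "C:\File\Scripts"
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 5) -RepetitionDuration ([TimeSpan]::MaxValue)
Register-ScheduledTask -TaskName "Zabbix SQL job status" -Action $action -Trigger $trigger -User "SYSTEM" -RunLevel Highest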

 


This code lists all EC2 instances in every region and, if termination protection is enabled on an instance, disables it:

import json
import boto3

def lambda_handler(event, context):
    client = boto3.client('ec2')
    ec2_regions = [region['RegionName'] for region in client.describe_regions()['Regions']]
    for region in ec2_regions:
        client = boto3.client('ec2', region_name=region)
        conn = boto3.resource('ec2', region_name=region)
        instances = conn.instances.filter()
        for instance in instances:
            #if instance.state["Name"] == "running":
            #    print(instance.id)  # , instance.instance_type, region
            terminate_protection = client.describe_instance_attribute(InstanceId=instance.id, Attribute='disableApiTermination')
            protection_value = terminate_protection['DisableApiTermination']['Value']
            if protection_value == True:
                client.modify_instance_attribute(InstanceId=instance.id, Attribute="disableApiTermination", Value="False")

 

If you need to get the members of a particular Azure AD role, use the script below:

 

Connect-AzureAD
#list all directory roles
Get-AzureADDirectoryRole | select displayname

$role = Get-AzureADDirectoryRole | Where-Object {$_.displayName -eq 'Company Administrator'}

Get-AzureADDirectoryRoleMember -ObjectId $role.ObjectId | Get-AzureADUser | select displayname,userprincipalname | Export-Csv "C:\Users\lap-top\Downloads\1.csv" -NoTypeInformation
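If you need the members of every directory role at once, a variation of the same calls could be used; a sketch (the output path is just an example):

# Sketch: export members of all Azure AD directory roles into one CSV (output path is an example)
Connect-AzureAD
Get-AzureADDirectoryRole | ForEach-Object {
    $roleName = $_.DisplayName
    Get-AzureADDirectoryRoleMember -ObjectId $_.ObjectId |
        Where-Object { $_.ObjectType -eq 'User' } |
        Select-Object @{n='Role';e={$roleName}}, DisplayName, UserPrincipalName
} | Export-Csv "C:\Temp\AllRoleMembers.csv" -NoTypeInformation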

Requirements:

  • If a file is deleted in the source, remove it from the destination as well.
  • If a file is deleted from the destination, do not remove it from the source.
  • If a file is already in both the source and the destination, do nothing.
  • If a file is in the source but not in the destination, copy it to the destination.
robocopy "\\source" "destination" /r:60 /w:5 /PURGE /MIR /MT:64

/R:60 – retry 60 times

/W:5 – wait 5 seconds between retries

/PURGE – delete from the destination if the file is not in the source

/MIR – MIRror a directory tree

/Z – copy files in restartable mode

If we use /Z (restartable mode), the transfer bandwidth is about 4 to 6 Mbps.

If we take off the /Z switch, it goes between 80 and 120 Mbps,

and we need to add /MT:64.

/MT[:n] :: Do multi-threaded copies with n threads (default 8).

This way the "file in use" error should be eliminated, since Robocopy will have enough time between the scheduled run times to copy even the largest files (~6 GB).
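Putting it together, a sketch of how the final command could be wrapped for a scheduled run, treating robocopy exit codes of 8 or higher as failures (the log path is an assumption):

# Sketch: mirror without /Z, multi-threaded, and log the result (log path is an assumption)
robocopy "\\source" "destination" /MIR /PURGE /MT:64 /R:60 /W:5 /NP /LOG+:"C:\Logs\robocopy.log"
if ($LASTEXITCODE -ge 8) {
    Write-Error "Robocopy reported a failure (exit code $LASTEXITCODE)"
}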

This script performs the following:

  • Finds EC2 instances whose Owner tag is missing or set to unknown/Unknown.
  • If such an instance has no TerminateOn tag, it adds one set to 7 days from now and stops the instance.
  • If the instance's TerminateOn date equals today, it disables termination protection (if enabled) and terminates the instance.
  • Otherwise it prints how many days remain until termination.


import boto3
import collections
import datetime
import time
import sys

from datetime import datetime
from dateutil.relativedelta import relativedelta

ec = boto3.client('ec2', 'eu-west-1')
ec2 = boto3.resource('ec2', 'eu-west-1')

# create date variables
date_after_month = datetime.now() + relativedelta(days=7)
today = datetime.now().strftime('%d/%m/%Y')

def lambda_handler(event, context):
    # Get instances whose Owner tag is missing or set to unknown/Unknown
    instance_ids = []
    reservations = ec.describe_instances().get('Reservations', [])

    for reservation in reservations:
        for instance in reservation['Instances']:
            tags = {}
            for tag in instance['Tags']:
                tags[tag['Key']] = tag['Value']
            if not 'Owner' in tags or tags['Owner'] == 'unknown' or tags['Owner'] == 'Unknown':
                instance_ids.append(instance['InstanceId'])

                # Check if the "TerminateOn" tag exists
                if 'TerminateOn' in tags:
                    # compare the TerminateOn value with the current date
                    if tags["TerminateOn"] == today:
                        # check if termination protection is enabled
                        terminate_protection = ec.describe_instance_attribute(InstanceId=instance['InstanceId'], Attribute='disableApiTermination')
                        protection_value = terminate_protection['DisableApiTermination']['Value']
                        # if enabled, disable it
                        if protection_value == True:
                            ec.modify_instance_attribute(InstanceId=instance['InstanceId'], Attribute="disableApiTermination", Value="False")
                        # terminate the instance
                        ec.terminate_instances(InstanceIds=instance_ids)
                        print("terminated " + str(instance_ids))
                        # send email that the instance is terminated
                    else:
                        # Send an email to engineering that this instance will be removed in X days
                        # (calculated from today's date and the TerminateOn date)
                        now = datetime.now()
                        future = tags["TerminateOn"]
                        TerminateOn = datetime.strptime(future, "%d/%m/%Y")
                        days = (TerminateOn - now).days
                        print(str(instance_ids) + " will be removed in " + str(days) + " days")
                else:
                    # the TerminateOn tag does not exist, so create it and stop the instance
                    ec2.create_tags(Resources=instance_ids, Tags=[{'Key': 'TerminateOn', 'Value': date_after_month.strftime('%d/%m/%Y')}])
                    ec.stop_instances(InstanceIds=instance_ids)
                    print("was shut down " + format(','.join(instance_ids)))

In one of the previous posts we created JIRA subtasks using the REST API; in this example we'll see how to create a new JIRA task with an Epic link, label, assignee and reporter.

Bash:

curl -D- -u user:pass -X POST --data "{\"fields\":{\"labels\":[\"SERVICES\"],\"assignee\":{\"name\":\"emergencyadmin\"},\"reporter\":{\"name\":\"user\"},\"project\":{\"key\":\"AA\"},\"summary\":\"Create user account in Local AD.\",\"description\":\"Create user account in Local AD.\",\"issuetype\":{\"name\":\"Managed Service\"},\"customfield_10107\":{\"id\":\"10505\"},\"customfield_10006\":\"CP-3289\"}}" -H "Content-Type:application/json" https://jira.company.com/rest/api/latest/issue/

customfield_10006 – Epic link
customfield_10107 – client account

Python:

 

#!/usr/bin/python

import sys
import json
import re
import requests
import subprocess
import os
import urllib2
import argparse

import datetime
from dateutil.relativedelta import *

one_month_ago = datetime.datetime.now() - relativedelta(months=1)

previous_month = one_month_ago.strftime("%B")

currentYear = datetime.datetime.now().year

password = str(sys.argv[1])

headers = {"Content-Type": "application/json"}
data = {"fields": {"labels": ["SERVICES"], "reporter": {"name": "user"}, "assignee": {"name": "emergencyadmin"}, "project": {"key": "AA"}, "summary": "SPLA usage report for {previous_month} {currentYear}".format(**locals()), "description": "Review SPLA usage report", "issuetype": {"name": "Managed Service"}, "customfield_10107": {"id": "10505"}, "customfield_10006": "CP-3289"}}
response = requests.post("https://jira.company.com/rest/api/latest/issue/",
                         headers=headers, data=json.dumps(data), auth=('user', password))

Login-AzureRmAccount
Set-AzureRmContext -SubscriptionId "subscriptionid"

$accountKeys = Get-AzureRmStorageAccountKey -ResourceGroupName "test" -Name "mystorageaccount201806"

#The Storage Context object itself is what enables us to authenticate to the Azure Storage REST API from PowerShell.

$storageContext = New-AzureStorageContext -StorageAccountName "mystorageaccount201806" -StorageAccountKey $accountKeys[0].Value

#SAS key expiry time

$expiryTime = (get-date).AddYears(1)

#set permissions: r - read, w - write, l - list

$permission = "rwl"

#Create policy

New-AzureStorageContainerStoredAccessPolicy -Context $storageContext -Container "test" -Policy "test" -ExpiryTime $expiryTime -Permission $permission

#Get token

$sasToken = New-AzureStorageContainerSASToken -Name "test" -Policy "test" -Context $storageContext
$sasToken = $sasToken.substring(1)

#Write-Host "SAS token (ref shared access policy): $sasToken"

$sasToken2 = New-AzureStorageContainerSASToken -Context $storageContext -Container tibp-userprofiles -Permission rwl
#Write-Host 'SAS token: ' $($sasToken2)
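As a quick test, the SAS token generated above can be used on its own to build a storage context and upload a blob; a minimal sketch (the local file path is an example):

# Sketch: authenticate with the SAS token only and upload a test blob (file path is an example)
$sasContext = New-AzureStorageContext -StorageAccountName "mystorageaccount201806" -SasToken $sasToken
Set-AzureStorageBlobContent -Context $sasContext -Container "test" -File "C:\Temp\test.txt" -Blob "test.txt"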