In my previous post (which you can find here) I used the Invoke-Expression cmdlet to run a PowerShell script that was downloaded with Invoke-WebRequest.
And this was a good solution. The downloaded code was a PowerShell script that ran a private function. This private function was structured with the three script blocks Begin, Process and End.
Parameters were downloaded with the same construct from a Git repository and placed in a PowerShell object called $P.
With this approach I separated the parameters from the actual code.
Using Git I was able to version my parameters file separately from my script code. This setup works great, and it gives flexibility by leaving the code untouched when changing parameters.
But…
Yeah, a but… I still needed a way to pass parameters/arguments on the command line, and with Invoke-Expression that wasn't possible.
So I looked into Invoke-Command, which has an -ArgumentList parameter, making it possible to pass one or more arguments to the script. Passing named parameters isn't possible, though, and that was exactly what I was looking for.
So to support named parameters, I decided to introduce just one parameter. This parameter is a JSON string, making it possible to pass multiple parameters merged into a single JSON object.
The only challenge with this is that all the interpreters the code passes through should leave the JSON string intact, including the quotes. And I didn't want to escape any quotes; that would be messy and prone to errors. Encoding it solves this issue: the argument is a Base64-encoded JSON string.
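Building such an argument can be sketched like this (a minimal example; the parameter name universe is made up for illustration):

```powershell
#-- example only: merge the named parameters into one JSON object
$json = @{ universe = "hello universe" } | ConvertTo-Json -Compress

#-- encode the JSON string as UTF-8 Base64, so the quotes survive every shell interpreter
$b64 = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($json))
$b64
```

The resulting string contains only Base64 characters, so no quoting or escaping is needed on the command line.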
Is it secure?
Well, no… not at all. It is just Base64 encoding. My goal was not to make it more secure, but to make sure the string wouldn't be changed by the different shell interpreters.
Of course, you can make it more secure by using private/public key pairs. You could use a Docker volume containing the encryption keys, or other secure methods. When using Base64 encoding, just don't pass any sensitive data (like passwords) with it. There are other, more secure approaches for that with containers.
Parameter approaches
This setup gives me different approaches to pass parameters to the script. The more static parameters are stored in a .json file in a Git repository.
The more dynamic parameters (like the names of VMs to start) are passed via the Base64-encoded JSON string.
What changed?
I changed the following items:
- changed the entrypoint string
- using Invoke-Command instead of Invoke-Expression
- placing Invoke-WebRequest inside a script block
- using -ArgumentList to pass a Base64 string that encodes a JSON string
- changed the PowerShell wrapper script to decode the input
Docker Entrypoint
The previous Docker entrypoint was something like:
pwsh -Command invoke-expression '$(Invoke-WebRequest -SkipCertificateCheck -uri ' + <git URI> + ' -Headers @{"Cache-Control"="no-store"} )'
The new entrypoint looks like this:
pwsh -Command invoke-command -scriptblock ([scriptblock]::Create( (Invoke-WebRequest -SkipCertificateCheck -uri <git URI> -Headers @{"Cache-Control"="no-store"} ).content ) ) -ArgumentList <base64 coded JSON string>
As you can see, the one-liner has grown.
I use the -ScriptBlock and -ArgumentList parameters of Invoke-Command. The script block contains the Invoke-WebRequest cmdlet, which downloads the raw version of the PowerShell script from the Git repository.
The Invoke-Command cmdlet then executes this script block, passing the argument from -ArgumentList to the script.
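The same construct can be tried locally, without the container or the Git download; a minimal sketch in which the script body is defined inline instead of being downloaded:

```powershell
#-- stand-in for the downloaded script content (example only)
$scriptText = 'param($inputObject) write-host "received argument: $inputObject"'

#-- a hypothetical Base64 encoded JSON argument
$b64 = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes('{"universe":"42"}'))

#-- same pattern as the entrypoint: create a script block and pass the argument
Invoke-Command -ScriptBlock ([scriptblock]::Create($scriptText)) -ArgumentList $b64
```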
Script Layer
The script has a wrapper layer, a main layer (containing the Begin, Process and End blocks), and the Process block containing the specific code to run.
<#
.SYNOPSIS
template.ps1 powershell
.PARAMETER inputObject
A JSON string base64 (UTF-8) encoded
#>
param(
[Parameter(
ValueFromPipeline = $true,
ValueFromPipelineByPropertyName = $true,
HelpMessage = "JSON string base64 (UTF-8) encoded.")]
[string]$inputObject = ""
)
function main {
}
#-- calling the real powershell code to run
main
main layer (function)
I chose to use the function method to preserve my code format structure. For most of my PowerShell code I use the Begin, Process and End script blocks to structure the code, and I didn't want to step away from that approach.
function main {
<#
.SYNOPSIS
#>
Begin{
$scriptrootURI = <url to RAW version of the repository>
$scriptName = <name of this script, without extension>
#-- trying to load parameters into $P object, preferably json style
try { $webResult = Invoke-WebRequest -SkipCertificateCheck -Uri ($scriptrootURI+$scriptName+".json") -Headers @{"Cache-Control"="no-store"} }
catch {
write-host ("uri : " + $scriptrootURI + $scriptName + ".json")
throw ("Request failed for loading parameters.json with uri: " + $scriptrootURI + $scriptName + ".json")
}
# validate answer
if ($webResult.StatusCode -match "^2\d{2}" ) {
# statuscode is 2.. so convert content into object $P
$P = $webResult.content | ConvertFrom-Json
} else {
throw ("Failed to load parameter.json from repository. Got statuscode " + $webResult.StatusCode)
}
#-- private functions
function exit-script {
...
}
#-- process inputObject (the argument passed via the cmd line)
# decode the inputObject as UTF-8 Base64 and convert it to a powershell object
$A = ConvertFrom-Json -InputObject ([System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String($inputObject))) -ErrorAction SilentlyContinue -ErrorVariable err1
if ($err1) {
write-host "Failed to process input object"
exit-script
}
}
End {
exit-script -exitcode 0
}
Process {
#-- the code that is doing the real work
write-host ($P.world) #-- from the parameter.json file
write-host ($A.universe) #-- passed via the cmd line argument
}
}
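Putting it together, a local run of the script could look like this (a sketch; template.ps1 and the parameter name universe follow the examples used earlier):

```powershell
#-- encode the dynamic parameters (example values)
$b64 = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes(
(@{ universe = "hello universe" } | ConvertTo-Json -Compress)))

#-- call the wrapper script with the single encoded argument
pwsh ./template.ps1 -inputObject $b64
```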
Final
So I hope this blog gives you some ideas for your code challenges.
I'm going to write a more structured set of articles, a deep dive into my FaaS-like setup. So keep following this blog if you're interested.
And comments are always welcome.