
First Published PowerShell Module

Today I tidied up a PowerShell module that I’ve been using myself for ages and pushed it to PSGallery.

It’s a pretty simple module that scratches one of my big itches with PowerShell: it’s really hard to write coloured text with the tools provided.

Writing colours to the console isn’t simple for command-line tools in general. Libraries like chalk exist in the NPM world, but I’d never found an equivalent for PowerShell. Now one exists 🙂
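To see what I mean, here’s the out-of-the-box approach: Write-Host can colour a whole string, but mixing colours on a single line means chopping the output into separate calls. (A small illustration of the built-in limitation, not the module’s API.)

    # Built-in colouring: every colour change needs its own Write-Host call
    Write-Host 'Status: ' -NoNewline
    Write-Host 'OK' -ForegroundColor Green -NoNewline
    Write-Host ' (3 warnings)' -ForegroundColor Yellow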

One of the issues with PowerShell is that the current terminal (really conhost, a process that hosts the shell) is notoriously (even infamously) limited in what it can do. Imagine a feature you’ve seen in a Linux terminal and there’s a very good chance that conhost doesn’t have it. Microsoft has finally started to address these deficiencies, and it’s about time, especially as Windows Core and Nano Server are likely to be very important platforms in the coming years, and those are command-line only.

You can check out my module on GitHub or simply install it and play with it.


Fixing my %PATH%

After I’ve been working on a computer for a while, it’s not unusual for my %PATH% environment variable to become messy.

It tends to contain wrongly formatted paths, or paths that no longer exist because the programs they pointed at have been uninstalled.

This was causing me problems with a tool that relied on the path and was having trouble with it. It turns out the tool was pretty primitive and a touch over-sensitive.

I had no idea which path it was choking on; there are a lot of entries in my %PATH%, and many of them use environment variables, so I thought I’d quickly write a PowerShell script to fix what’s in there.

There are a few simple rules for %PATH%:

  • It’s a semicolon-separated list
  • Don’t use quotes (") in paths
  • Spaces are permitted
  • Environment variables of the form %NAME% are permitted

The script simply checks that each path isn’t surrounded by quotes, ensures the path exists (resolving any environment variables first) and then returns a new string in the correct format for the %PATH% environment variable.
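In outline, it’s something like this (a sketch of the idea; the variable names are mine):

    # Sketch of Fix-Path.ps1: validate each %PATH% entry and rebuild the string
    $newPath = ($env:Path -split ';' |
        Where-Object { $_ } |                     # drop empty entries left by stray semi-colons
        ForEach-Object { $_.Trim('"').Trim() } |  # rule: no quotes around paths
        Where-Object {
            # resolve %NAME% style variables before testing that the path exists
            $expanded = [Environment]::ExpandEnvironmentVariables($_)
            Test-Path -LiteralPath $expanded
        }) -join ';'
    $newPath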

It returns the string rather than setting it, to give you a chance to verify the result yourself. But if you just want to update the %PATH% without that step, you can do this:
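Assuming the script is saved as Fix-Path.ps1, something like this does it for the current session (persisting it would need [Environment]::SetEnvironmentVariable):

    # Replace the current session's PATH with the cleaned value
    $env:Path = & .\Fix-Path.ps1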

Where "Fix-Path.ps1" is just a saved copy of the script above.

PowerShell Cmdlet Function by Example

One of the big issues I have when writing PowerShell functions is that there are all these little structures and patterns needed to do things.

So I thought I’d write myself a quick guide in the form of a heavily commented function that does most of what I need. It has support for pipelining, CmdletBinding, ShouldProcess and comment-based help.
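The shape of it is roughly this (a sketch; Set-Widget is a placeholder name):

    function Set-Widget {
        <#
        .SYNOPSIS
            Shows the boilerplate in one place: comment-based help,
            CmdletBinding, ShouldProcess and pipeline support.
        .EXAMPLE
            Get-Item *.txt | Set-Widget -WhatIf
        #>
        [CmdletBinding(SupportsShouldProcess = $true)]
        param(
            # Accept input from the pipeline, one object per process block
            [Parameter(Mandatory = $true, ValueFromPipeline = $true)]
            [object]$InputObject
        )
        begin {
            # Runs once, before any pipeline input arrives
        }
        process {
            # Runs once per pipeline object
            if ($PSCmdlet.ShouldProcess($InputObject, 'Update widget')) {
                # The actual work, guarded so -WhatIf and -Confirm behave
            }
        }
        end {
            # Runs once, after the last pipeline object
        }
    }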


Common Get-ScriptDirectory in PowerShell doesn’t always work

This function has been written dozens of times, in dozens of places, by dozens of bloggers, but I keep finding myself looking for it, so I thought I’d make myself a note. But I also discovered that there’s a problem with it.

If you’re writing a script in a module, it’s easy to use $PSScriptRoot to get the current script path. For other situations, the version that does the rounds looks like this:
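    function Get-ScriptDirectory {
        # Look one scope up (the caller) for its $MyInvocation
        $Invocation = (Get-Variable MyInvocation -Scope 1).Value
        Split-Path $Invocation.MyCommand.Path
    }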

This function, and others like it, appears in loads of blog posts, and it generally solves the problem. It doesn’t always work, though, as it contains a flaw.

The problem is -Scope 1. This tells Get-Variable to look at the parent scope. That works as long as you have the Get-ScriptDirectory function at the top level of your script and you call it from that top level.

Here’s an example script demonstrating the problem (a reconstruction that matches the output below):
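    # invoctest.ps1 -- reads $MyInvocation from various scopes
    "In Script (scope 1): [$((Get-Variable MyInvocation -Scope 1).Value.MyCommand.Path)]"
    "In Script (scope 0): [$((Get-Variable MyInvocation -Scope 0).Value.MyCommand.Path)]"
    "In Script (scope script): [$((Get-Variable MyInvocation -Scope script).Value.MyCommand.Path)]"

    function t {
        "In t() (scope 1): [$((Get-Variable MyInvocation -Scope 1).Value.MyCommand.Path)]"
    }
    function t2 {
        "In t2() (scope script): [$((Get-Variable MyInvocation -Scope script).Value.MyCommand.Path)]"
    }
    function q  { "in q()";  t  }   # adds a scope between t() and the script
    function q2 { "in q2()"; t2 }

    t; q; t2; q2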

Invoked as PS> ./invoctest.ps1, you get:

In Script (scope 1): []
In Script (scope 0): [F:\temp\invoctest.ps1]
In Script (scope script): [F:\temp\invoctest.ps1]
In t() (scope 1): [F:\temp\invoctest.ps1]
in q()
In t() (scope 1): []
In t2() (scope script): [F:\temp\invoctest.ps1]
in q2()
In t2() (scope script): [F:\temp\invoctest.ps1]

Notice how, when the call to t() is nested inside q(), the function breaks and you get nothing. The reason is that -Scope 1 is no longer looking at the script scope, but at the scope inside q(), the caller of t(). You’d need -Scope 2 when t() is invoked from inside q(); the deeper the nesting, the higher the value you have to give to -Scope.

You could walk the call stack if you wanted to, but my script above already includes the answer: use the "script" option for -Scope. That ensures you always get the script’s invocation variable.

This even works when the script is dot-sourced, or run in a debugger, like in PowerGUI.
The corrected function is therefore:
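    function Get-ScriptDirectory {
        # 'script' scope always resolves to the running script, however deeply nested the call
        $Invocation = (Get-Variable MyInvocation -Scope script).Value
        Split-Path $Invocation.MyCommand.Path
    }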


Taking Powershell with you

I’m a huge fan of the command line. There’s nothing you cannot do from the command line in Linux; in Windows, things were a lot harder. For too long Microsoft focused on the GUI and neglected the command line. (This is still noticeable where the terminal window on Windows is concerned, which is still decades behind Linux. I’ve used Console2 for ages, but I just discovered Scott Hanselman’s blog, where he suggests ConEmu, and I’m giving that a try now. Looks very good.)

Where Windows has finally caught up is PowerShell. It’s a command-line shell, scripting language and all-round bad-ass tool. Unlike Linux shells, it’s fully object-orientated: whereas Bash, Zsh and the rest pass strings around, PowerShell uses real objects. This post isn’t really about evangelising PowerShell, but if you’re doing any kind of technical work on Windows, you should be using it.
I love customising the environment I work in to make it perfect for me. The annoying thing is that I work on many different computers and it’s a pain keeping all those settings working together. A while back I came across a trick to get VIM settings synced using Dropbox (or a similar cloud file store). I set out to do the same with Powershell.
There are really three principal things I wanted to do:
  1. Ensure all my computers use the same profile
  2. Ensure all my computers have access to the same modules
  3. Allow computer specific settings to be set easily
I was surprised at how easy it actually was. It took some tweaking to get it perfect, but here is my solution.
PowerShell keeps everything it needs in C:\Users\<login>\Documents\WindowsPowerShell, so the first thing I did was copy everything out of there into a folder in Dropbox. I used Settings\powershell.
I now needed a way to connect my actual profile with the one PowerShell loads by default. This has to be as simple as possible, since it’s what I’ll need to do on any new computer, and I don’t want a dozen-line checklist.
It turns out PowerShell is happy with a simple dot-source of my profile in the new location. So (since I have different user names at home and work) I calculate the profile location in Dropbox and source it. It worked.
I then tested this on another computer and discovered an interesting issue. Dropbox of course downloads files from the ’net, and Windows immediately flags them as downloaded. PowerShell detects this and won’t run them unless they’re signed (they’re remote scripts and hence a security risk). I’m therefore lowering the execution policy to allow this. (A better solution might be to remove the download flag from all of them, but there are a lot of files once modules are included, so this will do for now.)
My profile.ps1 now looks something like this:
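    # A sketch -- the exact paths are illustrative; Dropbox lives under $HOME here
    Set-ExecutionPolicy Unrestricted -Scope Process -Force   # let the downloaded scripts run

    $tmpDropboxProfile = Join-Path $HOME 'Dropbox\Settings\powershell\profile.ps1'
    . $tmpDropboxProfile   # dot-source the real profile from Dropbox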

Next was getting my modules to load. I started by looking at ways to manipulate the path, or to automatically copy the modules to the PowerShell folder. That was ridiculously convoluted and there had to be an easier way. It took some searching, but I eventually came across $env:PSModulePath. Yep, the search path for modules.


I therefore mapped a drive to my modules directory and updated the module path, along these lines:
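    # A sketch with illustrative names: map a drive to the shared settings folder
    $tmpSettings = Join-Path $HOME 'Dropbox\Settings\powershell'
    New-PSDrive -Name powershell -PSProvider FileSystem -Root $tmpSettings | Out-Null

    # ...and add its Modules folder to the module search path
    $env:PSModulePath += ';' + (Join-Path $tmpSettings 'Modules')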

My modules all now load without any problems. Note that powershell: is a mapped drive as well.

PowerTab was causing some issues, but I fixed those by moving its settings file into the AppData folder, which can be set in the first-run wizard. That’s the best solution in any case, as different computers will have different things installed and might need different settings.

That’s all working. The final thing is custom settings per computer. This was actually the easiest part; once I’d figured out the other two requirements, I could simply write something like:
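    # Dot-source a per-machine profile if one exists in the shared folder (sketch)
    $tmpComputerProfile = Join-Path $tmpSettings "profile_$($env:COMPUTERNAME).ps1"
    if (Test-Path $tmpComputerProfile) { . $tmpComputerProfile }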

This looks for a file called profile_<computer>.ps1 and dot-sources it. Simple.

You may wonder about some of my strange variable naming. I like to keep things neat, and I create a whole load of variables during my profile initialisation. I therefore have a naming convention for variables I don’t need once the scripts have finished running, since a side effect of dot-sourcing a script is that its variables don’t drop out of scope.

At the end of my profile, therefore, I have something like:
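    # Assumes the convention is a 'tmp' prefix on throwaway variables, as in the sketches above
    Remove-Variable tmp* -ErrorAction SilentlyContinue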

and all my script variables are gone, leaving a very neat PowerShell session that works the way I want it to, on any computer.