
Archives for: Coding

Docker on Windows 10 Anniversary – [updated]

So Windows 10 Anniversary is out and it brings with it Containers!

Most likely if you’re like me and already interested in docker, you’ll have installed the Docker App from docker.io and will be wondering why the instructions don’t work.

Basically, the Docker app for Windows doesn’t currently support Windows containers.

If you run docker version you can see why:
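The exact output varies by build, but the tell-tale part is the Server section:

```powershell
PS> docker version
# Under "Server", OS/Arch reads linux/amd64 - the daemon is the Linux one
# inside the MobyLinux VM, which can't host Windows containers.
```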

The server is running Linux. No hosting Windows containers in that.

Now I could uninstall the current Docker app and then follow the instructions, but really I just want to quickly try out Windows containers while also keeping my Linux containers.

How I got it working was to tweak the instructions a bit:

  • From the System Tray, right-click on Docker and select “Exit Docker”.
  • In PowerShell, stop the Docker service: Stop-Service com.docker.service
  • Download the Windows container service archive (ensure you have a c:\temp folder); see the sketch after this list.
  • CD into the c:\temp folder and extract the archive.
  • There are 3 files in the new “docker” folder: docker.exe (which is the same as in the Windows Docker app), docker-proxy and dockerd.exe. We need dockerd.exe.
  • Register this new dockerd service and start it (again, see the sketch below).
  • Once you start the new Docker service, docker version will report a Windows server.
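The download link and exact commands aren’t reproduced here, so treat this as a hedged PowerShell sketch of those steps; the archive URL is a placeholder:

```powershell
# URL is a placeholder - use whatever the current instructions link to.
Invoke-WebRequest "https://example.com/docker-windows-server.zip" `
    -OutFile "c:\temp\docker.zip" -UseBasicParsing

cd c:\temp
Expand-Archive .\docker.zip -DestinationPath .    # produces the "docker" folder

.\docker\dockerd.exe --register-service           # register dockerd as a Windows service
Start-Service docker
```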

Now your Windows nanoserver container will happily install, as per the original instructions:
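From memory, the original instruction boils down to something like this (the image name is from that era and may differ):

```powershell
docker run -it microsoft/nanoserver cmd
```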

Unfortunately for me, it just hangs here, and does absolutely nothing. In the end I have to kill the docker.exe process and stop the service.

So still waiting on getting working containers on Windows.

To remove the Windows container service and return to Linux, stop the Windows container service (and optionally un-register it):
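In sketch form (the flags here are dockerd’s own):

```powershell
Stop-Service docker
c:\temp\docker\dockerd.exe --unregister-service   # optional: remove the service entirely
Start-Service com.docker.service                  # hand control back to the Docker app
```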

Then start the Docker app service again, then run the “Docker for Windows” app to start the Mobylinux host.

Resolved

Update to the latest version:

 

First Published PowerShell Module

Today I tidied up a PowerShell module that I’ve been using myself for ages and pushed it to PSGallery.

It’s a pretty simple module that scratches one of my big itches with PowerShell: it’s really hard to write coloured text with the tools provided.
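To see the itch: with the stock cmdlets, every colour change means splitting the string across separate calls.

```powershell
# The built-in way: one Write-Host call per colour change.
Write-Host "Build " -NoNewline
Write-Host "succeeded" -ForegroundColor Green -NoNewline
Write-Host " in 2.3 seconds"
```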

Generally writing colours to the console for command line tools isn’t always simple. Libraries like chalk exist in the NPM world, but I’ve never yet found one for PowerShell. Now one exists 🙂

One of the issues with PowerShell is that the current terminal (really conhost, a process that hosts the shell) is notoriously (even infamously) bad at what it can do. Imagine a feature you’ve seen in a Linux terminal and there’s a very good chance that conhost doesn’t have it. Microsoft has finally started to address these deficiencies, and it’s about time, especially as Windows Core and Nano Server are likely to be very important platforms in the future, and all of those will be command-line only.

You can check out my module on GitHub or simply install it and play with it.

 

Docker & PHP – on Windows

Having decided I needed to learn more about docker and finally start using it, I gave it a little try.

Docker on Windows… not so quick

A recent application I downloaded wanted to install an entire WAMP (Windows-Apache-MySQL-PHP) stack on my computer. There’s no way I’m having all of that installed and running.

I knew this was possible in docker, so I thought I’d start my foray into docker land by creating my own WAMP stack. It turned out to be more of a VDLNMP stack, as you’ll see.

After much searching amid a very low signal-to-noise ratio, I finally found an excellent tutorial at a place called osteel’s blog.

I was merrily following along until the instruction came to mount the local dev folder into the docker container. Now I’m running on Windows, so I’m using docker machine, and after some searching discovered that docker machine just doesn’t work like that.

I’d have to somehow copy my dev folder to the VM that’s hosting the docker containers and then mount wherever that folder ends up into the running containers. Too much indirection for me.

Vagrant to the rescue

I then remembered that vagrant forwards local folders very nicely, and into a very predictable location. I also remembered that vagrant supports docker as a provisioner; even better, a quick search turned up a docker-compose plugin for it as well.

I quickly built a very simple Vagrantfile:
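The original listing isn’t reproduced here; this sketch follows the vagrant-docker-compose plugin’s documented syntax, with the box name and ports as illustrative choices:

```ruby
# Vagrantfile (sketch) - box name and ports are illustrative.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"

  # Forward the containers' ports to the local machine.
  config.vm.network "forwarded_port", guest: 8080, host: 8080

  # Install docker, then run the compose file on every `vagrant up`/`reload`.
  config.vm.provision :docker
  config.vm.provision :docker_compose,
    yml: "/vagrant/docker-compose.yml",
    run: "always"
end
```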

Followed by vagrant up and, lo and behold, everything just worked.

What’s Happening?

  • The Vagrantfile uses the basic ubuntu box.
  • It then provisions by installing docker & docker-compose
  • The final two options on docker_compose are:
    • the docker-compose.yml to run (it allows multiples)
    • always run this provisioner, e.g. on vagrant reload

Finally to make my life easier I forward the ports on the vagrant box where my containers are running to my local machine.

The full project is available on GitHub.

VDLNMP: Vagrant-Docker-Linux-Nginx-MySQL-PHP

PowerShell which

One thing that quickly annoyed me was typing ‘which’ in my PowerShell console and getting an error:
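Something along these lines:

```powershell
PS> which ruby
# which : The term 'which' is not recognized as the name of a cmdlet,
# function, script file, or operable program.
```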

Why would I even want that? Because sometimes it’s important to know exactly how the console resolves what you type, especially when some items are in the path multiple times, and some can be obscured by new or other definitions.

Take Ruby. Ruby comes with “ri”, the documentation browser. Turns out in PowerShell there’s also an alias “ri” for Remove-Item.

The good news is that Get-Command does basically what I want. So a quick addition to my profile:
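The exact lines from my profile aren’t reproduced here; a minimal sketch of the idea:

```powershell
# 'which' as a thin wrapper over Get-Command.
function which {
    param([Parameter(Mandatory = $true)][string]$Name)
    # -All lists every matching command, not just the one that would run first.
    Get-Command -Name $Name -All
}
```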

and I now have my own ‘which’ command. I can easily find out how to invoke the Ruby “ri” command, without it becoming Remove-Item:
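For example:

```powershell
PS> which ri
# Shows both the built-in 'ri' alias for Remove-Item and Ruby's ri on the
# PATH, so it's clear what the console would resolve first.
```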

 

ICS Cards – Export Statements

As part of my banking with ABN Amro I get a credit card managed by ICS Cards.

The single most annoying thing about ICS Cards is that they don’t provide any means of exporting my statements. The next most annoying thing is that they haven’t responded to any of my emails requesting that feature.

Time to help myself then, in true developer fashion, I’ll write my own.

I’ve used loads of languages over the years and different languages suit different tasks. Python turns out to be my go-to language for scraping web-pages and pulling out information from them. Mostly because it’s really easy to knock up a script and because there are loads of great libraries already available to do most of the grunt work.

So here it is. A quick and dirty script to log into my account, parse the monthly statement page, fix the formatting and write it out as CSV.
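The script itself is too site-specific to reproduce faithfully; the skeleton below shows the approach, with the URL, form fields and HTML selectors as placeholders:

```python
# Sketch of the approach only; the real ICS Cards URLs, form fields and
# selectors will differ from these illustrative placeholders.
import csv
import requests
from bs4 import BeautifulSoup

session = requests.Session()
session.post("https://www.icscards.nl/login", data={"user": "...", "pass": "..."})

page = session.get("https://www.icscards.nl/statement")
soup = BeautifulSoup(page.text, "html.parser")

with open("statement.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["date", "description", "amount"])
    for row in soup.select("table.transactions tr"):
        cells = [c.get_text(strip=True) for c in row.find_all("td")]
        if cells:
            writer.writerow(cells)
```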

This is by no means a finished script (have you seen how messy it is!!), and what it downloads is kinda hard-coded.

Here’s an example output once the script is done with it:

Fixing my %PATH%

After a while working with a computer it’s not unusual for my %PATH% environment variable to become messy.

It tends to contain wrongly formatted paths, or paths that no longer exist from programs that have been uninstalled.

This was causing me problems when I ran a tool that relied on the path and was having trouble with it. Turns out this tool was pretty primitive and a touch too over-sensitive.

I had no idea which path it was choking on, and there are a lot of entries in my %PATH%, many of them using environment variables, so I thought I’d quickly write a PowerShell script to fix what’s in there.

There are a couple of simple rules for %PATH%:

  • It’s a semi-colon separated list
  • Don’t use quotes (“) in paths
  • Spaces are permitted
  • Environment variables of the form %NAME% are permitted.

The script simply checks each path isn’t surrounded by quotes, ensures the path exists (it has to resolve the environment variables first) and then returns a new string in the correct format for the %PATH% environment.
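The listing isn’t reproduced here, but the logic is simple enough to sketch:

```powershell
# Sketch of the clean-up logic; returns a rebuilt PATH string.
$cleaned = $env:PATH -split ';' |
    Where-Object { $_ } |                 # drop empty entries from stray semi-colons
    ForEach-Object { $_.Trim('"') } |     # rule: no quotes around entries
    Where-Object {
        # Expand %NAME% variables for the existence check, but keep the
        # variable form in the result.
        Test-Path ([Environment]::ExpandEnvironmentVariables($_))
    }
$cleaned -join ';'
```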

It returns the string to give you a chance to verify it yourself. But if you just want to update the %PATH% without that step, you can do this:
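Something like:

```powershell
$env:PATH = .\Fix-Path.ps1      # current session only
# Or persist it for your user account:
# [Environment]::SetEnvironmentVariable('PATH', (.\Fix-Path.ps1), 'User')
```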

Where “Fix-Path.ps1” is just a saved copy of the script above.

UML Diagrams the Simple Way

There is a great engineering aphorism to “Keep It Simple Stupid!”, the KISS principle.

Many developers often confuse easy and simple; I’ve done it in the past, and will probably do it again in the future. Where you stand can make a big difference to your perspective.

Take UML diagrams, and particularly drawing and collaborating with them. There are dozens of editors out there that allow you to easily drag/drop/link etc. all your different diagrams and classes. They’ll remember relationships and properties and other funky things. This can make it very easy to draw a diagram, or many diagrams. Unfortunately, it can also make life anything but simple for those you need to collaborate with: when you lose the tool (e.g. the vendor goes out of business, or the license expires), or you don’t have the tool, or it doesn’t run on your computer, updating the diagrams becomes a problem.

One reason I’m a big fan of the TodoTxt format (and even have my own TodoTxtJs tool) is that it makes life very simple. It’s not the easiest tool to work with, it’s not always easy to do every task, but it is an extremely simple format, and the big advantage of that is compatibility and portability. I don’t need anything more than Notepad to read and manage my Todos, and there’s no way I’ll ever not be able to edit or read them. Tools can make me more productive, but I don’t need them.

Wouldn’t it be great if the same applied to UML diagrams? Interestingly it does. Enter plantUML.

Take this example definition:
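The original snippet isn’t reproduced here, so here is a stand-in sequence diagram in the same PlantUML syntax:

```
@startuml
Alice -> Bob: Authentication Request
Bob --> Alice: Authentication Response
Alice -> Bob: Another Request
Bob --> Alice: Another Response
@enduml
```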

Even if you’d never seen the diagram and never seen the syntax before, with a bit of effort and some knowledge of UML, you’d be able to figure out what it’s describing. That is one aspect of what makes this simple.

In case you’re curious, this is the actual sequence diagram, and that’s just one of the many formats it understands:

UML Sequence Diagram

 

It also makes it a really elegant solution for collaborating with people. You don’t need to send huge binary files around in proprietary formats, just send simple text. You don’t even need the same “compiler” or “renderer”. Just knowing the syntax you could write your own, just as you’re able to understand what it’s showing by reading the text. A more complex diagram might require a paper and pencil 🙂

It also means that if you need to render it in a different way (e.g. colours, shapes, style) just use a different renderer. You have, coincidentally, gone a long way to separate content from presentation, and that’s almost always a good thing.

As an added bonus, if you’re storing your designs and documentation in a source control system, then you can diff a text file much more easily than a complex binary file.

The good news is that plantUML is open-source and free, so there’s no likelihood of it ever going away, although it does have some dependencies (GraphViz and Java) but those are also open-source. There are loads of ways to use it, even from within your IDE. My only disappointment is that there’s no VisualStudio plugin yet.

Of course, it’s much easier to drag & drop stuff with a mouse and have lots of funky tools do things for you auto-magically. Where those “easy” tools fall down is that you need to get the software, make sure it works and runs on your set-up, have it available and understand how to use it and, most importantly, that everyone else does too. What makes this text based syntax simple is that anyone can just open the file and work with it using any text editor.

If you can ensure that everyone you will ever work with, and have to collaborate with, will have the same tools; you have the tools and infrastructure to collaborate easily using that tool and you’re sure you’ll keep access to those tools for the entire life of your project/product, then using the “easy” tools might be your best choice.

If you cannot ensure that, then you’re probably much better off looking for a simpler solution like plantUML.

 

 

 

 

Powershell CmdLet Function by Example

One of the big issues I have when writing PowerShell functions is that there are all these little structures and patterns for doing things.

So I thought I’d write myself a quick guide in the form of a heavily commented function that does most of what I need. It has support for pipelining, CmdletBinding, ShouldProcess and comment-based help.
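The full walkthrough is behind the link below; a condensed sketch of the patterns it covers:

```powershell
function Invoke-Example {
    <#
    .SYNOPSIS
        Skeleton advanced function showing the common patterns in one place.
    .EXAMPLE
        'one', 'two' | Invoke-Example -WhatIf
    #>
    [CmdletBinding(SupportsShouldProcess = $true)]
    param(
        # Values can be passed directly or come down the pipeline.
        [Parameter(Mandatory = $true, ValueFromPipeline = $true)]
        [string]$Name
    )

    begin   { Write-Verbose 'Runs once, before any pipeline input.' }
    process {
        # ShouldProcess gives the function -WhatIf and -Confirm for free.
        if ($PSCmdlet.ShouldProcess($Name, 'Process item')) {
            Write-Output "Processed $Name"
        }
    }
    end     { Write-Verbose 'Runs once, after all pipeline input.' }
}
```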

Continue Reading >>

Common Get-ScriptDirectory in Powershell doesn’t always work

This function has been written dozens of times, in dozens of places, by dozens of bloggers, but I keep finding myself looking for it, so I thought I’d make myself a note. But I also discovered that there’s a problem with it.

If you’re writing a script in a module, it’s easy to use $PSScriptRoot to get the current script path. For other situations there’s:
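The commonly blogged version goes like this:

```powershell
function Get-ScriptDirectory {
    $Invocation = (Get-Variable MyInvocation -Scope 1).Value
    Split-Path $Invocation.MyCommand.Path
}
```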

This function, and ones like it, appear in loads of blog posts, and it generally solves the problem, but it doesn’t always work, as it contains a flaw.

The problem is -Scope 1. This switch tells the script to look at the parent scope. That works as long as you have the Get-ScriptDirectory function at the top level of your script and you call it from that top level.

Here’s an example script demonstrating the problem:
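A sketch of such a script (reconstructed to match the output below):

```powershell
# invoctest.ps1 - reconstructed sketch; its shape is dictated by the output below.
function t {
    $inv = (Get-Variable MyInvocation -Scope 1).Value
    "In t() (scope 1): [$($inv.MyCommand.Path)]"
}
function q { "in q()"; t }

function t2 {
    $inv = (Get-Variable MyInvocation -Scope script).Value
    "In t2() (scope script): [$($inv.MyCommand.Path)]"
}
function q2 { "in q2()"; t2 }

"In Script (scope 1): [$(((Get-Variable MyInvocation -Scope 1).Value).MyCommand.Path)]"
"In Script (scope 0): [$(((Get-Variable MyInvocation -Scope 0).Value).MyCommand.Path)]"
"In Script (scope script): [$(((Get-Variable MyInvocation -Scope script).Value).MyCommand.Path)]"
t
q
t2
q2
```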

Invoked as PS> ./invoctest.ps1, you get:

In Script (scope 1): []
In Script (scope 0): [F:\temp\invoctest.ps1]
In Script (scope script): [F:\temp\invoctest.ps1]
In t() (scope 1): [F:\temp\invoctest.ps1]
in q()
In t() (scope 1): []
In t2() (scope script): [F:\temp\invoctest.ps1]
in q2()
In t2() (scope script): [F:\temp\invoctest.ps1]

Notice how when the call to t() is nested inside q(), the function breaks and you get nothing. The reason is that -Scope 1 is no longer looking at the script scope, but at the scope inside function q(). You’d need -Scope 2 if the function t() is invoked from inside q(). The higher the nesting level, the higher the value to give to -Scope.

You could look at the call stack, if you wanted to, but my script above already includes the answer. You need to use the “script” option for -Scope. That will ensure you always get the correct invocation variable.

This even works when the script is dot-sourced, or run in a debugger, like in PowerGUI.
The corrected function is therefore:
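That is, the common function with the scope swapped:

```powershell
function Get-ScriptDirectory {
    $Invocation = (Get-Variable MyInvocation -Scope script).Value
    Split-Path $Invocation.MyCommand.Path
}
```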

 

 

Taking Powershell with you

I’m a huge fan of the command line. There’s nothing you cannot do from the command line in Linux; in Windows, things were a lot harder. For too long Microsoft focused on the GUI and neglected the command line. (This is still noticeable where the terminal window on Windows is concerned, which is still decades behind Linux. I’ve used Console2 for ages, but I just discovered Scott Hanselman’s blog and he suggested ConEmu, so I’m giving that a try now. Looks very good.)

Where Windows has finally caught up is PowerShell. It’s a command line shell, scripting language and all round bad-ass tool. Unlike Linux shells, it’s fully object-oriented; whereas Bash, Zsh etc. all need to pass strings around, PowerShell uses real objects. This post isn’t really about evangelising PowerShell, but if you’re doing any kind of technical work on Windows, then you should be using PowerShell.
I love customising the environment I work in to make it perfect for me. The annoying thing is that I work on many different computers and it’s a pain keeping all those settings working together. A while back I came across a trick to get VIM settings synced using Dropbox (or a similar cloud file store). I set out to do the same with Powershell.
There are really three principal things I wanted to do:
  1. Ensure all my computers use the same profile
  2. Ensure all my computers have access to the same modules
  3. Allow computer specific settings to be set easily
I was surprised at how easy it actually was. It took some tweaking to get it perfect, but here is my solution.
PowerShell keeps everything it needs in c:\users\<login>\Documents\WindowsPowerShell so the first thing I did was copy everything out of there into a folder in Dropbox. I used Settings\powershell.
I now needed a way to connect my actual profile, with the one powershell loads by default. This has to be as simple as possible, since it’s what I’ll need to do on any new computer and I don’t want a dozen line check-list.
Turns out, powershell is happy with a simple dot-source of my profile in the new location. So (since I have different user names at home and work) I calculate the profile location in dropbox and source it. It worked.
I then tested this on another computer and discovered an interesting issue. Dropbox of course downloads files from the ‘net and Windows immediately flags them as being downloaded. PowerShell detects this and won’t run them unless signed. (They’re remote scripts and hence a security risk). I’m therefore lowering the security permissions to allow this. (A better solution might be to remove the download flag on all of them, but there’s a lot of files once modules are included, so this will do for now.)
My profile.ps1 now looks like this:
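It amounts to something like this (folder names are illustrative):

```powershell
# profile.ps1 (sketch) - hand everything over to the Dropbox-synced profile.
# Dropbox-synced files carry the 'downloaded' flag, so relax policy for this process.
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Unrestricted

# $env:USERPROFILE copes with different user names at home and work.
$dropboxProfile = Join-Path $env:USERPROFILE 'Dropbox\Settings\powershell\profile.ps1'
. $dropboxProfile
```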

Next was getting my modules to load. I started by looking at ways to manipulate the path or to automatically copy the modules to the PowerShell folder. This was ridiculously convoluted and there had to be an easier way. It took some searching, but I eventually came across $env:PSModulePath. Yep, the search path for modules.


I therefore mapped a drive to my modules directory and updated the modules path:
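Something along these lines (the Dropbox path is illustrative):

```powershell
# Map a 'powershell:' drive to the synced settings folder...
New-PSDrive -Name powershell -PSProvider FileSystem `
    -Root (Join-Path $env:USERPROFILE 'Dropbox\Settings\powershell') | Out-Null

# ...and add the synced Modules folder to the module search path.
$env:PSModulePath += ';' + (Join-Path $env:USERPROFILE 'Dropbox\Settings\powershell\Modules')
```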

My modules all now load without any problems. Note that powershell: is a mapped drive as well. PowerTab was causing some issues, but I fixed those by moving its settings file into the AppData folder, which can be set in the first-run wizard. That’s the best solution in any case, as different computers will have different things installed and might need different settings.

That’s all working. The final thing is custom settings per computer. This was actually the easiest part, especially once I’d figured out everything for the other two requirements; I was able to simply write it:
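In sketch form (reusing the profile variable from the sketch above):

```powershell
# Dot-source a per-machine profile if one exists, e.g. profile_MYLAPTOP.ps1.
$machineProfile = Join-Path (Split-Path $dropboxProfile) "profile_$($env:COMPUTERNAME).ps1"
if (Test-Path $machineProfile) { . $machineProfile }
```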

This looks for a file called profile_<computer>.ps1 and dot-sources it. Simple.

You may wonder about some of my strange variable naming. I tend to like things neat, and I create a whole load of variables during my profile initialisation. I therefore have a naming convention for variables I don’t need once the scripts have finished running, since a side effect of dot-sourcing a script is that its variables don’t drop out of scope.

At the end of my profile therefore I have:
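Assuming a prefix convention like ‘tmp_’ (the actual prefix is a detail), one wildcard sweep does it:

```powershell
# Remove every profile-scratch variable in one go.
Remove-Variable -Name 'tmp_*' -ErrorAction SilentlyContinue
```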

and all my script variables are gone and I’m left with a very neat Powershell session that works the way I want it to, on any computer.