This is less a blog post and more an exercise in shared pain, a good-humored attempt to help you avoid similar mistakes... It's also another chance to prove my three DevOps axioms:

  • Everything I needed to know about DevOps, I learned on Sesame Street (e.g. troubleshooting is "which one of these is not like the other").
  • There is nothing new under the sun (but contexts change).
  • Engineers by trade are really typesetters by practice (80% of our job boils down to carefully arranging punctuation marks).

Properly escaping complex tasks in Jenkins pipelines is often a game of "how many interpreters get to mess with your head", but don't take my word for it... just look at this gist, which proves it's not only real but has been driving engineers insane for years.
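Here's a tiny sketch of how many hands touch even a trivial command (the parameter name and the awk bit are made up purely to count interpreters):

// layer 1: Groovy parses the triple-quoted string and interpolates ${params.msg}
// layer 2: /bin/sh parses whatever Groovy produced
// layer 3: awk parses its own program, so its $1 has to hide from Groovy as \$1
sh """ echo "${params.msg}" | awk '{ print \$1 }' """

Three programs, three different opinions about what a dollar sign or a quote means.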

I knew that going in; I bought my ticket... I've written plenty of Jenkins pipelines, and my use case isn't even complex. I'm not dealing with multi-level interpolation. It's a simple thing, really.

It all started with consuming a few friendly parameters (sanitized to protect the innocent), a common thing done in many pipelines:

sh """ test ! -z '${params.foo}'
       test ! -z '${params.bar}'
       test ! -z '${params.baz}' """
Nothing to see here...

Since the triple doubles (triple doubles? dollar slashy strings? what kind of gibberish have we been reduced to?) were providing sensible interpolation, I single-quoted the params. Easy enough, and it worked great for months.

Then one day an atrocious, unthinkable thing happened... someone submitted a parameter containing a single quote. How dare they!

Wait, it was me? Still, who does that?
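To make the breakage concrete, here's a sketch with a made-up value standing in for the offending parameter:

def foo = "it's fine"          // stand-in for params.foo; the value is invented
sh """ test ! -z '${foo}' """  // interpolates to: test ! -z 'it's fine'

Groovy does its interpolation before the shell ever sees the command, so by the time /bin/sh shows up the quoting is already unbalanced: a quoted 'it', a stray s, and an opening quote that never closes.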

Fine, easy fix... let's ensure parameters with single quotes feel comfortable (of course this is now a fix happening at 2 AM, because when else do you find neato bugs?):

sh """ test ! -z \"${params.foo}\"
       test ! -z \"${params.bar}\"
       test ! -z \"${params.baz}\" """
Here lies insanity.

That's kinda ugly. Hmm, oh well at least it works. This is a shell after all. Wait, does it? Is it? Why does this always happen at 2 AM with no coffee in sight?

Turns out, the data contained both single and double quotes. Ruh-roh. Time for a game of bury the backslash with a bonus round of dollar slashy strings...
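For the record, this is roughly what 2 AM me was lining up next (a sketch from memory, not a recommendation; the replace call and the dollar slashy string are purely illustrative):

// backslash-escape the doubles, then reach for a dollar slashy string so Groovy
// leaves the backslashes alone on their way to the shell
def escaped = params.foo.replace('"', '\\"')
sh $/ test ! -z "$escaped" /$

Which still falls over the moment the data contains a backslash, a backtick, or a $(command substitution). That hole has no bottom.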

Wait a minute. Aren't there tools that can help with this insanity? Maybe really old techniques no one talks about that are actually still useful? Things a seasoned engineer would get right in the first place instead of discovering them at 2 AM while a timely fix is being deployed?

So I've got data coming in that could contain all kinds of quote-confusing, escape-escaping madness. I want to deal with simple strings over the wire and just get the data safely where it needs to go (it's been sanitized, but sometimes even sanitized data is insanely formatted). Duh! Encode the data.

In this case, a web service was submitting a JSON blob as a pipeline parameter. That worked great until it didn't. So the pipeline was reverted to its simplest state, and the service was changed to submit base64-encoded data that could be handled like a friendly ASCII string instead.

const toBase64 = (string = '') => {
  if (string === '') throw new Error('Must provide string to encode')
  return Buffer.from(string).toString('base64')
}

const doSomething = async data => {
  const tag = doWork(data)
  const meta = await lookupSomething(tag)
  const json = JSON.stringify(data)
  const job = await jenkins.getJobInfo('worker')
  
  await jenkins.triggerBuild({
    jobName: 'worker',
    params: {
      foo: toBase64(tag),
      bar: toBase64(meta.whatever),
      baz: toBase64(json)
    }
  })
  return { ...job, meta }
}
Eureka!

Now with a little more work, we can sanely consume base64 using standard UNIX utilities. We need to pad our task's Dockerfile a bit more:

FROM alpine:latest

RUN apk add --no-cache coreutils
coreutils provides the base64 CLI

Now we can decode the string values:

#!/bin/sh

set -eu

# Jenkins params are base64 encoded
FOO=$(echo "${1}" | base64 -d 2>/dev/null)
BAR=$(echo "${2}" | base64 -d 2>/dev/null)
BAZ=$(echo "${3}" | base64 -d 2>/dev/null)
The UNIX philosophy. Simple tools, working together.
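And the pipeline side goes back to being boring, which is the whole point. Something like this (the script name and argument order are illustrative, not lifted from the real job):

sh """ ./task.sh '${params.foo}' '${params.bar}' '${params.baz}' """

Base64 only ever produces letters, digits, +, / and =, so those single quotes finally have nothing to worry about.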

Simple pipeline. Simple data... well, maybe not. More like simpleton assumptions that worked great for months, until they didn't late one night in an already stressful situation.

Couldn't we just fix the pipeline? Maybe, but it's better to avoid the problem by submitting properly encoded data in the first place (which also helps avoid foot-shooting when this service integrates with other things).

You might not always have the luxury of digging through code and fixing the larger problem upstream... and there are always trade-offs, like build logs being less readable (surmountable with a bit more scripting). But the key takeaway is not to become too cocksure about "simplicity" or the shape of your data. There are genuinely simple tools to help us avoid these kinds of problems. Use them wisely.