Visual Studio Power Environment

It is sometimes interesting to speculate about how unusual feature patterns in Microsoft projects may reflect the traditions or prejudices of the teams involved with creating and evolving those projects. Today's example is "%VS140COMNTOOLS%"..\..\vc\bin\vcvars32.bat, that wily little batch script that mucks with your command environment just enough to enable most Visual Studio declarative-like thingies to happen (easily) at the command prompt.

So PowerShell has been around for like 17 years now (if you include the Monad proto period). That's enough time for an effective rewrite or two of the Visual Studio codebase, and yet we're still stuck with this inconvenient script written in possibly the worst shell still in popular use today. How can we get it to work with possibly the best shell in popular use today? Unfortunately, there is no single command that will dump the environment of a cmd subprocess into the parent PowerShell environment. I found several hacks online proposing to do this, the most correct and elegant being Invoke-Environment.

Note

[Digression] reStructuredText's link formatting is egregiously inferior.

What I really wanted was some Groovy code that would dynamically modify the environment of a declarative pipeline with the results of vcvars32.bat. Jenkins pipeline allows for setting environment variables dynamically, but the dynamism packaged into this feature only allows arbitrary values for an a priori set of variable names, while vcvars32.bat sets an indeterminate set of variable names. There may still be a way to do this, but my Groovy-fu does not show me how to edit the env object with sufficient introspection.

Here is the essential functionality of Invoke-Environment applied to vcvars32.bat. This is what I used to prime the PowerShell environment for Visual Studio work, pasting the code block directly into the powershell() step before executing Visual Studio commands. It is not the most elegant, but it is only a few lines more than directly sourcing the batch script and seems to be the plainest way to absorb its environment.

# Run vcvars32.bat in a cmd subprocess, dump the resulting environment with SET,
# and replay each NAME=VALUE pair into the current PowerShell process.
foreach ($_ in cmd /c "`"%VS140COMNTOOLS%`"..\..\vc\bin\vcvars32.bat > nul 2>&1 & SET") {
    if ($_ -match '^([^=]+)=(.*)') {
        [System.Environment]::SetEnvironmentVariable($matches[1], $matches[2])
    }
}

Note

If you are using this in a Jenkins pipeline, all $'s will need to be escaped as \$ to signify that they denote PowerShell variables and not Groovy variables, and the backslashes in the cmd path will need to be doubled (\\) because \ is also Groovy's escape character. Also, ` is PowerShell's escape character, so `" is an escaped quote.

How To Mount LUKS

Warning

This code is meant to be a trenchant example of the essential actions necessary. You are crazy and I am not responsible if you copy and paste.

  • Mount LUKS:
    apt install cryptsetup
    LUKS_BLKDEV=$(blkid | grep crypto_LUKS | cut -d ':' -f 1)
    cryptsetup luksOpen ${LUKS_BLKDEV} root
    # Enter passphrase
    mkdir /mnt/root && mount /dev/mapper/root /mnt/root  # Skip this step if `/dev/mapper/root` is an LVM physical volume (see the blkid check below).
    
  • Mount LVM:

    Note

    Ubuntu seems to always use LVM inside of LUKS, so this is how to set up the LVM layer once LUKS has been decrypted.

    apt install lvm2
    VOL_GRP=$(vgscan | tail -n 1 | cut -d '"' -f 2)
    vgchange -ay ${VOL_GRP}
    mkdir /mnt/root && mount /dev/${VOL_GRP}/root /mnt/root
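
How do you know which case you are in? A quick check of the opened mapping tells you whether it holds a mountable filesystem or an LVM physical volume (a minimal sketch; the device name matches the luksOpen mapping above):

blkid /dev/mapper/root    # TYPE="ext4" (or similar): mount it directly; TYPE="LVM2_member": do the LVM steps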
    

Changing the Package Your BlueJeans Came in

Recently my company decided to move from WebEx to BlueJeans for conferencing. While they do provide Linux support, such support is limited to RPM-based systems.

I can sympathize with this because, although the RPM spec is an antique format with plenty of arcane customs and traditions, if you start the day with a goal to create an RPM and your app is not too complex, you will have an RPM by the end of the day, if not sooner. Replace RPM with deb and you will likely find yourself lost in a world you never knew existed. At the end of the day, if you have a deb, it means you're already a Debian monk or are ready to become one.

The BlueJeans app itself 'seems' to be just another Electron app or similar. Thus, adapting it to a deb-based system turned out to be trivial with a little alien magic. Here's the Salt state I used:

{%- set bj_full_ver = '1.37.22' %}
{%- set bj_part_ver = bj_full_ver.rsplit('.', 1)[0] %}
{%- set bj_rpm = 'bluejeans-{}.x86_64.rpm'.format(bj_full_ver) %}

bluejeans-deps:
  pkg.installed:
    - name: BlueJeans deps
    - pkgs:
      - libgconf-2-4
      - libudev-dev
  file.symlink:
    - name: /lib/x86_64-linux-gnu/libudev.so.0
    - target: /lib/x86_64-linux-gnu/libudev.so
    - require:
      - pkg: bluejeans-deps

bluejeans-get:
  cmd.run:
    - name: wget https://swdl.bluejeans.com/desktop/linux/{{ bj_part_ver }}/{{ bj_full_ver }}/{{ bj_rpm }} --output-document /tmp/{{ bj_rpm }}
    - unless:
      - dpkg -l bluejeans

bluejeans-convert:
  cmd.run:
    - name: alien --keep-version --scripts /tmp/{{ bj_rpm }}
    - cwd: /tmp
    - onchanges:
      - cmd: bluejeans-get

bluejeans-install:
  cmd.run:
    - name: dpkg --install /tmp/bluejeans_{{ bj_full_ver }}-1_amd64.deb
    - onchanges:
      - cmd: bluejeans-get
    - require:
      - cmd: bluejeans-convert

bluejeans-clean:
  cmd.run:
    - name: rm --force /tmp/bluejeans*
    - onchanges:
      - cmd: bluejeans-get
    - require:
      - cmd: bluejeans-install
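
Assuming the state above lives in your file_roots as bluejeans.sls (a hypothetical name), applying it locally looks something like:

salt-call --local state.apply bluejeans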

While this gets the job done, it's not very elegant, and in these modern times one expects packages to either bundle into the base package repo that comes with the distro or come out of a GPG-signed app repo. A bare package URL is kind of amateur, especially considering how easy it is to set up a repo (createrepo ftw). A Debian repo is also not too hard to set up once you invest the time to learn. But I'm willing to admit to being a DevOps engineer with smug opinions about how the world should function.
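
For instance, a yum-style repo is little more than a directory of RPMs plus metadata; a minimal sketch (the paths and URL are hypothetical):

mkdir -p /var/www/html/myrepo
cp bluejeans-*.rpm /var/www/html/myrepo/
createrepo /var/www/html/myrepo    # generates the repodata/ metadata
# Clients then point a .repo file at it, e.g. baseurl=http://repo.example.com/myrepo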

Python ArgParse Columns

Have you ever been bothered by cramped help text formatted by Python's argparse module, especially its affinity for enthusiastic left indents? It has that distinctive format universally discernible, for features good and unfortunate. Today we are going to explore and annihilate one of these features of misfortune. Like pip, pytest, and (despite what the authors may claim, in effect, given its abstruse Java-celebrating recursions into metastatic blandness) the logging module, argparse depresses onto the user some burden of monolithic inflexibility, in a way that feels as if the user shall be prevented from getting the job done by the nonblocking opinion of aloof, indifferent code. In the best case, the job is done with extra provisions for these superior opinions that provide comedic commentary, written in code, on such opinions. In the case of argparse, the formatting customization is limited to three simplistic choices. Subclassing is not an option because the way these classes affect argparse output behavior is hidden in the source code (obfuscated from sane external minds as that unwholesome logic called implementation detail). However, though it is dispiriting to learn how awfully the code you routinely employ is hampered by irritating internal rigidity, this particular black box needn't stumble anyone, since the columns conundrum itself yields readily to a few lines of boilerplate.

Problem

Observe:

#!/usr/bin/env python3
# my_script.py
import argparse


def _get_opts():
    """
    Setup options for my_script
    """
    ap = argparse.ArgumentParser(description='My script does something, I guess',
                                 formatter_class=argparse.ArgumentDefaultsHelpFormatter)
    ap.add_argument('-a', '--an-argument-with-a-really-long-help-msg', help='The intent of this help string is to demonstrate how unreadable a hard line wrap at a narrow column span looks when there is an expansive, capacious field of space to the right of this text available in the terminal.  The environment variable os.environ[\'COLUMNS\'] does not inherit the parent process\'s (a bash shell in this case) $COLUMNS value.')
    return vars(ap.parse_args())


def my_action():
    """
    Run the action of my script
    """
    opts = _get_opts()
    raise NotImplementedError('Really, I don\'t plan to do anything at all here')

if __name__ == '__main__':
    my_action()

$ python3 my_script.py --help
usage: my_script.py [-h] [-a AN_ARGUMENT_WITH_A_REALLY_LONG_HELP_MSG]

My script does something, I guess

optional arguments:
  -h, --help            show this help message and exit
  -a AN_ARGUMENT_WITH_A_REALLY_LONG_HELP_MSG, --an-argument-with-a-really-long-help-msg AN_ARGUMENT_WITH_A_REALLY_LONG_HELP_MSG
                        The intent of this help string is to demonstrate how
                        unreadable a hard line wrap at a narrow column span
                        looks when there is an expansive, capacious field of
                        space to the right of this text available in the
                        terminal. The environment variable
                        os.environ['COLUMNS'] does not inherit the parent
                        process's (a bash shell in this case) $COLUMNS value.
                        (default: None)

Solution

Since Python 3.3, the shutil module has provided a way to insert the terminal's geometry into your script in a way that argparse comprehends:

import os
import shutil
import argparse

# Necessary boilerplate to imbue the argparse help output with essential readability.
# Set COLUMNS before the parser formats any help text.
os.environ['COLUMNS'] = str(shutil.get_terminal_size().columns)

$ python3 my_script.py --help
usage: my_script.py [-h] [-a AN_ARGUMENT_WITH_A_REALLY_LONG_HELP_MSG]

My script does something, I guess

optional arguments:
  -h, --help            show this help message and exit
  -a AN_ARGUMENT_WITH_A_REALLY_LONG_HELP_MSG, --an-argument-with-a-really-long-help-msg AN_ARGUMENT_WITH_A_REALLY_LONG_HELP_MSG
                        The intent of this help string is to demonstrate how unreadable a hard line wrap at a narrow column span looks when there is an expansive, capacious field of space to the right of this text available in the terminal. The environment variable os.environ['COLUMNS'] does not inherit the parent
                        process's (a bash shell in this case) $COLUMNS value. (default: None)

Shell Function to Execute a Command and Return the Output

'Shell', whatever that means, is one or another species of a collection of ancient language families complicated by an indefinite permutation of complementary, supplementary, and conflicting forward, backward, and sideward compatibility relations. Lately shell usually means bash, although I know there will be disputes even here. The primary argument for this convention is that all major distros are set up with bash as the default shell. Being an ancient language, its quirks and compatibility knots are more intrusive than one would desire coming from experience with modern languages, so any idiom that is complete, clear, and expressive feels like an achievement worth recording.

Here is a function that I created, discovered after code review that I did not need, but still want to remember. It executes its first argument and 'returns' the output into its second argument. Whether this is actually 'POSIX compliant' I do not know, but the fact that it works on GNU Bash 4.1+ seems careful enough to satisfy my uncertainty constraints.

function EXECUTE()
{
    # $1: Command to execute
    # $2: Capture the output

    if [ -n "${MY_SCRIPT_TEST}" ] ; then
        LOG "${FMT_BLUE}" '' "Would have executed: ${1}"
    else
        LOG "${FMT_MAGENTA}" '' "Executing: ${1}"
        # Set arg 2 to the output resulting from the execution of arg 1
        eval "$2='$(eval ${1})'"
    fi
}

Here's the function in the context of some other useful functions that altogether do something that could be useful and well defined.

#!/bin/bash

function USAGE()
{
    printf -- '%s\n' "This is my script that performs some actions"
    printf -- '%s\n' "set MY_SCRIPT_COLORS=1 to get colorized output and"
    printf -- '%s\n' "set MY_SCRIPT_TEST=1 to pretend"
}


function SETUP_LOG()
{
    local LOG_DIR=${1}

    VALIDATE_DIR "${LOG_DIR}" 'logging directory'

    # Create the log file so that LOG() will also write to it
    touch "${LOG_FILE}"

    # Global variables ########################################################
    PRINTF_HAS_FMT_B=''
    ###########################################################################
    # Detect whether this printf supports the %b format directive
    printf '%b%s' 'str' &> /dev/null
    [ "$?" = "0" ] && PRINTF_HAS_FMT_B='yes'
}


function LOG()
{
    local MESSAGE_COLOR=$1
    local PROCESS_ESCAPES=$2
    local MESSAGE=$3
    local TIME_STAMP=$(date +'%Y-%m-%dT%H:%M:%S')

    # Process escape sequences in message if requested and available
    if [ -n "${PROCESS_ESCAPES}" ] ; then
        if [ -n "${PRINTF_HAS_FMT_B}" ] ; then
            MESSAGE=$(printf '%b%s' "${MESSAGE}")
        fi
    fi

    # Log to stdout
    if [ -n "${MY_SCRIPT_COLORS}" ] ; then
        printf -- '\e[00;%sm%s\e[0m\n' "${MESSAGE_COLOR}" "${MESSAGE}"
    else
        printf -- '%s\n' "${MESSAGE}"
    fi

    # Log to log file
    if [ -f "${LOG_FILE}" ] ; then
        printf -- '[%s] %s\n' "${TIME_STAMP}" "${MESSAGE}" >> "${LOG_FILE}"
    fi
}


function VALIDATE_DIR()
{
    local DIR=$1
    local DESC=$2

    if [ -z "${DIR}" ] ; then
        LOG "${FMT_RED}" '' "Please provide the ${DESC}"
        exit 1
    fi

    mkdir -p "${DIR}"
    if [ ! -d "${DIR}" ] ; then
        LOG "${FMT_RED}" '' "Cannot ensure directory ${DIR}"
        exit 1
    fi
}


function VALIDATE_ARGS()
{
    # Global variables ########################################################
    FMT_BLACK=30
    FMT_RED=31
    FMT_GREEN=32
    FMT_YELLOW=33
    FMT_BLUE=34
    FMT_MAGENTA=35
    FMT_CYAN=36
    FMT_WHITE=37
    FMT_EXTENDED=38
    FMT_DEFAULT=39

    LOG_DIR='/tmp/my_script'
    LOG_FILE="${LOG_DIR}/my_script.log"
    ###########################################################################

    # Display help message
    local HELP=''
    case "${1}" in
        '-h') HELP='yes' ;;
        '--help') HELP='yes' ;;
        'help') HELP='yes' ;;
        *) ;;
    esac
    if [ -n "${HELP}" ] ; then
        USAGE
        exit 0
    fi

    SETUP_LOG "${LOG_DIR}"
    LOG "${FMT_CYAN}" '' "Executing command line: ${0} ${*}"
}


function EXECUTE()
{
    # $1: Command to execute
    # $2: Capture the output

    if [ -n "${MY_SCRIPT_TEST}" ] ; then
        LOG "${FMT_BLUE}" '' "Would have executed: ${1}"
    else
        LOG "${FMT_MAGENTA}" '' "Executing: ${1}"
        # Set arg 2 to the output resulting from the execution of arg 1
        eval "$2='$(eval ${1})'"
    fi
}


function PERFORM_SOME_ACTIONS()
{
    local TMP_FILE_COUNT=''
    EXECUTE 'find /tmp -type f | wc -l' TMP_FILE_COUNT
    LOG "Temporary file count is ${TMP_FILE_COUNT}"
}


VALIDATE_ARGS "${@}"
PERFORM_SOME_ACTIONS
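
Saved as, say, my_script.sh (a hypothetical name) and made executable, the knobs mentioned in USAGE drive it like so:

MY_SCRIPT_TEST=1 ./my_script.sh     # log what would run without executing anything
MY_SCRIPT_COLORS=1 ./my_script.sh   # colorized output
./my_script.sh --help               # print the usage text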

Whose Proxy?

There are many diagrams and word salads presented in the cause of explaining forward and reverse (backward? I like linguistic symmetries) proxies. The munificent logic and unholistic reason thus spewed confounded me in my simple desire to know. Fortunately for you, in a dream I had last night I explained it to a dream person thus, and when I had spoken it I knew it was the reductive idea purged of all contrarian confounding encumberment, clairvoyance having carried it forward until now. There is no need to fret about the client or server not knowing about the presence of the proxy in either case.

In a forward proxy, the server communicates with the proxy as a client. In a reverse proxy, the client communicates with the proxy as a server.
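
A pair of curl invocations makes the asymmetry concrete (proxy.example.com and the other hosts here are hypothetical):

# Forward proxy: the client knows about the proxy and asks it to fetch the page;
# the origin server only ever sees a connection coming from the proxy (its "client").
curl --proxy http://proxy.example.com:3128 http://origin.example.org/

# Reverse proxy: the client simply talks to what it believes is the server;
# the proxy at www.example.org quietly relays the request to backends behind it.
curl http://www.example.org/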

Random Piths

Frameworks vs Libraries (or Systems vs Tools)

Frameworks tend to make your design decisions for you before your project ever exists, so you are always going to be constrained to the framework's approach, which invariably covers a selection of the problem space much smaller than the total space the framework advertises itself as solving. These blind spots arise mainly from the designers' ignorance and/or bliss. There are also always hidden and abstracted layers of automatic code that you don't know about and/or don't want to know about.

Libraries tend to be collections of utilities, each maximized to its most useful and basic form without diluting its value below the threshold of convenience, novelty, or beauty. Good libraries don't impose state or evolutionary patterns on your project.

Conclusion: Use a framework for something dreadfully time or resource constrained, with the idea of redoing it later.

On Selecting, Editing, and Using a Development Pattern or Design

Often trivial design elements, whether used for pedagogical work or because they posit a facile but expressive symmetry, are extrapolated due to affinity for their deceptively elegant ontology into absurd abstrusions that abrade utility in a most vexing and distracting counterpoint. When the prescribed design presides over the dictates of the existential problem or need set, the design eclipses the problem by a scale measured in orders of magnitude.

Conclusion: Don't code to a design when the design malevolently subverts the problem you are trying to solve.

The Proper Method of Maintaining Institutional Knowledge

Ideally, each engineer on a team will actively promote and disseminate their essential institutional knowledge until at least one complete logical copy thereof exists in and can be assembled from the P2P context of the team's total knowledge collective.

Conclusion: Don't hide essential knowledge from the team's communication network, or allow it to be hidden.

iptables: -p vs --proto vs --protocol

Observe the following three iptables commands:

# iptables -A INPUT -s 10.20.0.0/24 -d 10.10.0.0/24 -i eth0 -m policy -p esp --dir in --pol ipsec --reqid 1 -j ACCEPT
# iptables -A INPUT -s 10.20.0.0/24 -d 10.10.0.0/24 -i eth0 -m policy --proto esp --dir in --pol ipsec --reqid 1 -j ACCEPT
# iptables -A INPUT -s 10.20.0.0/24 -d 10.10.0.0/24 -i eth0 -m policy --protocol esp --dir in --pol ipsec --reqid 1 -j ACCEPT

Whereas --proto is a valid synonym for --protocol (as are -p and --proto[c[o[l]]]), if the exact token --proto appears in the rule after -m policy, it will be appropriated by the policy extension of iptables. Indeed, the resulting rules as reported by iptables -vL are:

# iptables -vL
Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
 pkts bytes target     prot opt in     out     source               destination
    0     0 ACCEPT     esp  --  eth0   any     10.20.0.0/24         10.10.0.0/24         policy match dir in pol ipsec reqid 1
    0     0 ACCEPT     all  --  eth0   any     10.20.0.0/24         10.10.0.0/24         policy match dir in pol ipsec reqid 1 proto esp
    0     0 ACCEPT     esp  --  eth0   any     10.20.0.0/24         10.10.0.0/24         policy match dir in pol ipsec reqid 1

This token overloading is an unfortunate design conflict, and such subtlety confounded SaltStack's iptables state module, though not its execution module. It's also really confusing unless you know that --proto can be used in two distinct ways in two distinct places.

So, according to the iptables(8) and iptables-extensions(8) man pages, -p, --proto[c[o[l]]] specifies "the protocol of the rule or of the packet to check," while --proto, as used by the IPsec policy extension, matches the encapsulation protocol. Evidently the encapsulation protocol is different from the protocol used for IPsec traffic. I still have more to learn about this, otherwise I would be more helpful here.
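
If you want both spelled out unambiguously, keep -p (the packet protocol) ahead of -m policy and let --proto after it name the encapsulation protocol for the policy match; a sketch along the lines of the rules above:

iptables -A INPUT -s 10.20.0.0/24 -d 10.10.0.0/24 -i eth0 -p esp \
    -m policy --dir in --pol ipsec --proto esp --reqid 1 -j ACCEPT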

strongSwan VPN Between Two VMs

A Pleasant Discovery

In the course of technological events arising from the labor of one's job, one frequently finds oneself either forestalled by uncertainty and variability in setting up an unfamiliar system or tool, or confused by the complexity of such a system or tool and its unwholesome documentation, or both. (The appreciable frequency of this scenario is one of the reasons why filtering job candidates on flat task lists often proves so foolhardy, even as it is defended as the safe choice. Rather, Aristotle, I think, would advise not to look upon a person's resume, but at what that person will do next.)


Conjecture Threshold

Today at his blog, Peter Woit responded to some rebuke or other from the stringeratti cabal on his disrespect for their sacred creed and its inevitable ascendance. He's got a severe commenting policy there and I don't begrudge it. I can imagine someone with his votive alert for bullshit, borne out of disgust at the general delusion as an ecclesiastical order co-opts fundamental physics in a way everyone loves to compare to the celebrated hubris that was shattered 113 years ago; I can imagine someone in his position carefully insulating himself from the ruling academic class with an appreciable buffer that protects comments as well.

But this is my blog and, waxing bold in my own sovereign licensure, I have neither professional apprehension towards the high court of ivy-strangled decorum nor reverence for an ostensible ranking of living minds, and I grant myself permanent freedom to post whatever I fancy. The following comment was deemed inappropriate for the curated safe comment space on the blog itself:

If we're now in the business of measuring and comparing the 'intelligence' housed in discrete human instances, I'd say that Woit's smarter than any of those anointed to ascend into the high ivory tower incapable of discerning that the multiverse has no clothes, despite whatever whizbang smoke and mirrors they're able to conjure in its favor.

If this comment seems gratuitously ad hominem, it's because I err towards frustration with the status-quo keepers of the theoretical enterprises in fundamental physics rather than any desire, or not, to construct a paean to Woit.