GitLab 11.6.0 was just released, and one of the new features is the ability to delete a pipeline. Why might you want to delete a little bit of history? Well, perhaps you inadvertently caused some private information (a password, say) to be printed. Or perhaps you left Auto DevOps enabled and ended up with a whole slew of useless failures. Either way, you want to go and delete the pipeline in question and all of its output, artifacts, etc. Well, here’s a quick ‘hack’ to achieve the goal.

You run it as ‘python -t token -u url -g group -p project’ and it will list all the pipeline ids. You can then run it with “-d ‘*'” to delete all of them, or “-d #,#,#…” to delete specific ones. You’ll likely want to ‘pip install python-gitlab’ first.

It’s a quick and dirty hack, but it worked for me. YMMV.
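Before the full script, here is a minimal sketch of how that ‘-d’ argument is interpreted. The helper name is mine for illustration; the script below inlines this logic:

```python
# How the -d argument is interpreted: '*' means every pipeline,
# a comma-separated list means specific ids. The helper name is
# illustrative; the real script inlines this logic.
def should_delete(delete_arg, pipeline_id):
    """True if this pipeline id is selected by the -d argument."""
    if delete_arg is None:
        return False                     # no -d: just list pipelines
    ids = delete_arg.split(',')
    return ids == ['*'] or str(pipeline_id) in ids

print(should_delete('*', 42))        # True
print(should_delete('1,2,3', 2))     # True
print(should_delete('1,2,3', 9))     # False
print(should_delete(None, 42))       # False
```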

#!/usr/bin/env python3
import gitlab
import argparse
import os
import requests
import sys

parser = argparse.ArgumentParser()
# Assumption: the defaults for url/token originally came from elsewhere in the
# script; environment variables are used here so it runs standalone.
parser.add_argument('-u', '--url', help='API url',
                    default=os.environ.get('GITLAB_URL'))
parser.add_argument('-t', '--token', help='personal token',
                    default=os.environ.get('GITLAB_TOKEN'))
parser.add_argument('-g', '--group', help='group', default=None, required=True)
parser.add_argument('-p', '--project', help='project', default=None, required=True)
parser.add_argument('-d', '--delete', help='delete id(s)', default=None)

args = parser.parse_args()

if args.delete:
    args.delete = args.delete.split(',')

gl = gitlab.Gitlab(args.url, args.token)

def find_group(gl, group):
    groups = gl.groups.list()
    for g in groups:
        if g.name == group:
            return g
    return None

def find_subgroup(gl, group, subgroup):
    subgroups = group.subgroups.list()
    for sg in subgroups:
        if sg.name == subgroup:
            return sg
    return None

# if project is of form x/y, then assume x is a subgroup
def find_project(gl, group, project_with_subgroup):
    pws = project_with_subgroup.split('/')
    if len(pws) == 2:
        subgroup = find_subgroup(gl, group, pws[0])
        project = pws[1]
        group = gl.groups.get(, lazy=True)
    else:
        project = pws[0]
    projects = group.projects.list()
    for p in projects:
        if p.name == project:
            return p
    return None

group = find_group(gl, args.group)
gproject = find_project(gl, group, args.project)
if not gproject:
    print("Error: %s/%s does not exist" % (args.group, args.project))
    sys.exit(1)

project = gl.projects.get(
pipelines = project.pipelines.list()
for pid in pipelines:
    pl = project.pipelines.get(
    if args.delete and (str( in args.delete or args.delete == ['*']):
        # python-gitlab has no pipeline delete call yet, so hit the REST API directly
        path = "%s/api/v4/projects/%s/pipelines/%s" % (args.url,,
        r = requests.delete(path, headers={'PRIVATE-TOKEN': args.token})
        print("%s %s %s -- DELETE" % (,, pl.status))
    else:
        print("%s %s %s" % (,, pl.status))


All of this is done in the ‘penguin’ container of ‘termina’ (i.e. enable ‘Linux’ in the Chrome OS settings). By default it’s Debian 9.6, which runs Python 3.5. But you might want to run e.g. Quart, which wants a newer rev for some asyncio features. So, here goes.

Step 1: Install dev essentials, as root (e.g. sudo)

apt-get update
apt-get install -y build-essential libncurses5-dev libgdbm-dev libnss3-dev libssl-dev libreadline-dev libffi-dev zlib1g-dev

Step 2: Install clang/llvm, as root (e.g. sudo)

echo deb llvm-toolchain-stretch-7 main > /etc/apt/sources.list.d/llvm.list
wget -O - | sudo apt-key add -

apt-get update
apt-get install -y libllvm-7-ocaml-dev libllvm7 llvm-7 llvm-7-dev llvm-7-doc llvm-7-examples llvm-7-runtime clang-7 clang-tools-7 clang-7-doc libclang-common-7-dev libclang-7-dev libclang1-7 clang-format-7 python-clang-7 libfuzzer-7-dev lldb-7 lld-7 libc++-7-dev libc++abi-7-dev libomp-7-dev

update-alternatives --install /usr/bin/llvm-profdata llvm-profdata /usr/bin/llvm-profdata-7 90
update-alternatives --install /usr/bin/clang clang /usr/bin/clang-7 90
update-alternatives --install /usr/bin/clang++ clang++ /usr/bin/clang++-7 90

Step 3: Get & Install Python

tar -xzvf Python-3.7.1.tgz
cd Python-3.7.1
./configure --enable-optimizations
make -j4
make install

And you have yourself some Python 3.7! Pip on!

So most of you will have the Slovak ‘NBU’ on your RSS speed-dial, but I found I was a bit behind on my reading of it. As I was catching up, skcsirt-sa-20170909-pypi caught my eye. In a nutshell, it’s about a phenomenon called ‘typo-squatting’. In this case, Python-package name squatting (dubbed ‘pytosquatting’).

So there was a popular package ‘urllib2’. The developers moved on to version 3 (urllib3), deleting the old one. Someone moved in and registered ‘urllib’ and ‘urrlib2’. In turn, unwitting people like you and I would do a ‘pip install urllib’ or ‘import urrlib’. Done, right? Wrong! It behaved properly (so you didn’t notice) and then… well… had side effects you didn’t want. Other typos included ‘urlib3’ (dropped ‘l’), etc.
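The near-miss names can be caught mechanically with edit distance. A minimal sketch, where the ‘popular’ list is illustrative rather than a real registry snapshot:

```python
# Minimal edit-distance check for typo-squatting candidates.
# The 'popular' list here is illustrative, not a real registry snapshot.
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def squat_suspects(name, popular, max_dist=1):
    """Popular packages within max_dist edits of name (excluding exact match)."""
    return [p for p in popular if p != name and edit_distance(name, p) <= max_dist]

popular = ['urllib3', 'requests', 'setuptools']
print(squat_suspects('urlib3', popular))      # ['urllib3'] -- dropped 'l'
print(squat_suspects('request', popular))     # ['requests']
```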

Here are a few they highlighted.

  • acqusition (uploaded 2017-06-03 01:58:01, impersonates acquisition)
  • apidev-coop (uploaded 2017-06-03 05:16:08, impersonates apidev-coop_cms)
  • bzip (uploaded 2017-06-04 07:08:05, impersonates bz2file)
  • crypt (uploaded 2017-06-03 08:03:14, impersonates crypto)
  • django-server (uploaded 2017-06-02 08:22:23, impersonates django-server-guardian-api)
  • pwd (uploaded 2017-06-02 13:12:33, impersonates pwdhash)
  • setup-tools (uploaded 2017-06-02 08:54:44, impersonates setuptools)
  • telnet (uploaded 2017-06-02 15:35:05, impersonates telnetsrvlib)
  • urlib3 (uploaded 2017-06-02 07:09:29, impersonates urllib3)
  • urllib (uploaded 2017-06-02 07:03:37, impersonates urllib3)
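As a toy illustration, the advisory’s list can be turned into a blocklist and checked against a requirements file. The requirements lines below are invented for the example:

```python
# Check requirement names against the squatted packages from the advisory.
# The requirements lines below are invented for illustration.
SQUATTED = {'acqusition', 'apidev-coop', 'bzip', 'crypt', 'django-server',
            'pwd', 'setup-tools', 'telnet', 'urlib3', 'urllib'}

def flag_requirements(lines):
    """Return requirement names that appear on the squatted-package list."""
    flagged = []
    for line in lines:
        name = line.split('==')[0].split('>=')[0].strip().lower()
        if name in SQUATTED:
            flagged.append(name)
    return flagged

print(flag_requirements(['requests==2.20.0', 'urllib', 'acqusition>=1.0']))
# ['urllib', 'acqusition']
```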


It finally happened to you. A developer used ‘import A’. A pulled in B, B pulled in C, D. D pulled in E… and somewhere along that chain evil lurked. Now all your bits are belong to l33t hackerz.

So like all things in life, it’s time to over-react after the fact (something about a barn door and a horse).

And like all good things in life, research ~= google. So you do. And you find… shocking… a set of tools. Sadly language specific, but let’s not be greedy.

What these tools do is glom onto your git repo and snoop around. They find a ‘requirements.txt’ (or the equivalent) and parse it. Then they go find those packages, parse them, and so on. A tree is built. And then they watch the vulnerability stream of those upstreams for you. Some even conveniently issue a Pull-Request to your repo when they find an issue!
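The tree-building step can be sketched with a hard-coded dependency map standing in for real package metadata (the A/B/C/D/E names match the chain described above):

```python
# Toy version of the transitive dependency walk these tools perform.
# DEPS is a hard-coded stand-in for real package metadata.
DEPS = {
    'A': ['B'],
    'B': ['C', 'D'],
    'C': [],
    'D': ['E'],
    'E': [],
}

def transitive_deps(package, deps=DEPS, seen=None):
    """Depth-first walk: every package reachable from 'package'."""
    if seen is None:
        seen = set()
    for dep in deps.get(package, []):
        if dep not in seen:
            seen.add(dep)
            transitive_deps(dep, deps, seen)
    return seen

print(sorted(transitive_deps('A')))  # ['B', 'C', 'D', 'E']
```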

And cuz it’s (nearly) 2019, they all have an API, a freemium business model, and some ‘open-ish’ source on Github.

But, tl;dr: if waiting for bad things to happen to good software in production is your means of knowing you have a security issue, maybe you should look at moving that detection upstream with some auto-security-dependency-tracking. You can maybe merge this with your SAST platform (like Clair).