On the off-chance you’ve come here via my Twitter link, I’m dreadfully sorry to say that Twitter doesn’t (often) let me follow new people: I’m at a limit, and it takes about 20 new followers before I can follow one new person.

It’s a complete arse. I’m trying to use app.net more, but let’s face it, app.net is very niche.

Feel free to at-me, though.

I failed to find a good working example of pulling a spreadsheet from Google Docs using cURL. Everything I found was broken in one way or another.

A bit of playing, and quite a bit of reading, got me this:

#!/bin/bash
PASS=`cat /path/to/0600/google-password-file`
SHEET="https://spreadsheets.google.com/feeds/download/spreadsheets/Export?key=addyourownsheetIDhere&exportFormat=csv&gid="

AUTH_TOKE=`curl --silent https://www.google.com/accounts/ClientLogin -d \
    Email=foo@example.org -d \
    Passwd=${PASS} -d \
    accountType=HOSTED -d \
    source=cURL-SpreadPull -d \
    service=wise | grep Auth= | sed 's/Auth/auth/'`

# TAB holds the gid of the worksheet to fetch; see the note on gid below.
curl --silent --output /path/to/file --header "GData-Version: 3.0" --header "Authorization: GoogleLogin ${AUTH_TOKE}" "${SHEET}${TAB}"

which seemed to do the trick.

Thinking about it, the export key could be defined as a variable in the script. You’ll need to supply your own; I typically grab it from the web-based URI, but there is a warning in the docs about that:


To determine the URL of a cell-based feed for a given worksheet, get the worksheets metafeed and examine the <link> element in which rel is http://schemas.google.com/spreadsheets/2006#cellsfeed. The href value in that element is the cell feed’s URI.

YMMV.

I’ve added in &exportFormat=csv&gid= because I wanted CSV output, and gid’s value is supplied via a for … in loop with a case statement, as sketched below.
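A minimal sketch of what I mean (the worksheet names and gid numbers are invented for illustration; look your own gids up from the sheet):

for TAB_NAME in accounts members
do
    # Map each worksheet name to its gid; these pairings are made up.
    case ${TAB_NAME} in
        accounts) TAB=0 ;;
        members)  TAB=1 ;;
    esac
    curl --silent --output /path/to/${TAB_NAME}.csv \
        --header "GData-Version: 3.0" \
        --header "Authorization: GoogleLogin ${AUTH_TOKE}" \
        "${SHEET}${TAB}"
done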

--header "GData-Version: 3.0" was needed to avoid the redirection.

Hopefully, this might be of benefit — as a working (when written) example of using curl and google docs/google spreadsheets.

A while ago, having finally got fed up with logging in individually to upgrade each of the no2id machines and jails, I decided to write a script to do the ‘hard work’ for me.

This worked fine, until today, when I noticed apt-listbugs complaining, and causing the script to fail to dist-upgrade.

Not a problem, thought I. I’m sure others have had this issue too. Being lazy, I thought the first port of call would be the internets. I’d have thought something like:

"DEBIAN_FRONTEND=noninteractive" "apt-listbugs"

might have done the trick. It didn’t (that I could find).

So I went back to doing what a lot of the new breed of ‘devops’ fail to do, and what I’m quite hypocritical about: looking at the manpage.

The manpage provides us with this gem:

ENVIRONMENT VARIABLES
    APT_LISTBUGS_FRONTEND
        If this variable is set to “none”, apt-listbugs will not
        execute at all, this might be useful if you would like to
        script the use of a program that calls apt-listbugs.

So there we go.

 for M in $MACHINES
 do
     echo "Connecting to ${M}.no2id.net"
-    ssh root@${M}.no2id.net 'export TERM=xterm; export DEBIAN_FRONTEND=noninteractive; apt-get update && echo "" && echo "" && echo "This is "'${M}'".no2id.net" && echo "" && echo "" && apt-get dist-upgrade'
+    ssh root@${M}.no2id.net 'export TERM=xterm; export DEBIAN_FRONTEND=noninteractive; export APT_LISTBUGS_FRONTEND=none; apt-get update && echo "" && echo "" && echo "This is "'${M}'".no2id.net" && echo "" && echo "" && apt-get dist-upgrade'
 done

Hopefully, this will help others whose first port of call is the internets, and not manpages.

You may, however, be sensible — and have had the time to roll out Puppet (ugh, when did they change their website! Why‽) or Chef.

Here’s what I’ve just sent to the London Decompression leads’ list; sadly, sometimes we have to take a stand for what we believe in :(

Following on from information about the proposed venue (‘Cable’) for this year’s London Decompression, their insistence on using “Clubscan”, and my principles, I feel I can no longer be involved in, or participate in, this year’s London Decompression event.

Should there be a change in their deployment of the Clubscan products/services (or any such similar ones), I may be able to reconsider this.

I’m really quite disappointed that, as a group, we’ve not – to my knowledge (nothing’s been mentioned about it) – regarded privacy as a concern, or carried out (for example) a Privacy Impact Assessment.

Whilst many folks are content to abrogate their privacy, for some of us it’s been an issue that we – and our forebears – have fought for, and are still fighting for: including taking matters to the courts.

It was through privacy campaigning that I was introduced to the Burner Community (as strange as that may seem). I know I’m not the only privacy-aware/concerned individual from amongst our community.

That’s even before the issue of bringing out passports/photo ID, let alone surrendering the data.

I should make it clear that, whilst I am a fairly well-documented privacy campaigner, I’m far from the irrational knee-jerk contingent; I review matters and thence form rational decisions.

The whole concept irks me immensely, putting me at ideological differences with the rest of you.

In the spirit of collective decision making, I sadly no longer wish to take part in the organizing of, or participation in, this year’s London Decompression. I know of friends who will not be participating, too.

As such, I have no choice but to step down with immediate effect. Please do not contact me regarding anything relating to the London Decompression, unless it’s to say that Southwark Council have rescinded, and that the council/licensing board/venue chosen takes privacy seriously, and ceases to insist on the installation and use of any of the “Clubscan” sort of products. The whole principle – and their “argument” for the deployment – does not, to me, hold water.

For the time being, whilst I further evaluate my feelings/options, I will not be rescinding the use of my domains. However, should this contemptuous disregard of privacy continue, the offer will not be available again.

I honestly wish that I could offer my hopes for the best, but under these circumstances, and my hardly concealed views, I don’t feel able to offer this. Particularly as it supports a company that, in my view, pushes the limits of the spirit of the law, as well as promulgating a ‘papers please’ culture: and that’s just to ‘have fun’.

I’ve recently(ish) started using Transmission as my torrent client; the change-over comes from my switching-things-off approach: instead of keeping KTorrent running on the laptop (and caning my bandwidth), I can have the work happen mainly on the NAS, which is always on (bar power-cuts/maintenance).

One of the things I noticed was the apparent lack of renaming within Transmission (and the curious way earlier tickets are marked as duplicates of later ones).

So, erm, I’ve written something that works for me. And hacked out the mailer-script to something a little cleaner – at least in my view.

The premise is that you’re using a POSIXish operating system — my NAS runs on Debian — and that all of your exports are within the /nas directory, and your torrents directory is /nas/torrents.

The changes needed (made with Transmission not running, apparently) are to set script-torrent-done-filename in settings.json to /path/to/post-download.
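If memory serves, the relevant fragment of settings.json looks something like the below; the enabled flag is my assumption of what’s needed alongside the path, so check your own settings.json for the exact keys:

    "script-torrent-done-enabled": true,
    "script-torrent-done-filename": "/path/to/post-download",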

You’ll need to echo the destination, "/path/to/store/the/completed-file", into a file named as per the torrent (see your incomplete directory for that) but with “.move” appended; the rest should all happen automagically.
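A hypothetical example, assuming the .move files live alongside the torrents in /nas/torrents (the /nas/isos destination is invented too):

    # Tell post-download to move the completed Ubuntu ISO into /nas/isos.
    echo "/nas/isos" > "/nas/torrents/ubuntu-10.04.1-alternate-i386.iso.move"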

You might find it useful to chown the directories you’ll be moving things into, to the user running the Transmission processes; I tend to setgid them to my GID too.
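Something along these lines, assuming Transmission runs as the debian-transmission user and your group is media (both names are examples, not gospel):

    # Hand the destination to the Transmission user, with my group;
    # the setgid bit makes new files inherit that group.
    chown debian-transmission:media /nas/isos
    chmod g+s /nas/isos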

The other file, mvtor, is for doing a manual move; specify the torrent as an argument, e.g., ./mvtor "ubuntu-10.04.1-alternate-i386.iso"