

Bloggus Doekmanni

This is Bloggus Doekmanni, or Doekman's blog if you like. A bit of drivel about computer programming; some dead serious, some not so much...

A feed by Doeke Zanstra


One Closure Explanation

Permalink - Posted on 2021-04-11 22:00

Closures in programming languages are pretty intuitive to use. Understanding what they are is actually a bit more difficult.

A closure is a function, but not a regular function. To better understand closures, it helps to first understand regular functions. So I'll explain how regular functions work, and after that I'll make the connection to functional closures.

Regular functions

When you call a regular function, a lot of things are going on. Under the hood, a stack structure is used. Things the function needs, like arguments and the return address, get pushed onto this stack. This is a bit of a simplification (there are optimising compilers, for instance), but I want to explain the principle here. If you want to delve into the nitty-gritty details, please check out calling conventions on Wikipedia.

So how does this work, what gets pushed up our stack-for-algorithmic-purposes? Let’s explain this with a JavaScript example.

1: function test(a, b) {
2: 	var c = a + b;
3: 	console.info('The sum of %s + %s is %s', a, b, c);
4: }
5: test(1, 2);

The program runs the following steps.

  • Before calling test, the return address is pushed onto the stack. With this info, the function knows where to “return” to. So let’s push the address of line 5 onto the stack.
  • After that, the arguments 1 and 2 are pushed onto the stack. Now there are 3 items on the stack.
  • Now we actually can start running the function. This is done via a jump to the entry point of the function, like goto line 1. I will indent steps that are within the function call:
    • At line 1 there is nothing to do.
    • At line 2, there is some arithmetic, and the result is stored in the local variable c. Locals are stored on the stack too, so 3 will get pushed onto the stack, and we have 4 items on the stack.
    • Line 3 will print a line to the console, while referencing three items on the stack.
    • At line 4, the function returns. All local variables, arguments and return address are popped from the stack. The return address is used to jump back to (the end of) line 5.
  • And at that point there is no more code to execute, and our program has finished running.

At first glance this may seem a bit tedious and boring. Yet, some smart stuff is going on. When the function returned at line 4, all local variables, arguments and return address have been cleaned up. No need for garbage collection or reference counting. Very efficient.

And it works pretty well when calling other functions, or even with recursive calls. The stack size is the limit! After that there is only a stack overflow, but that is about it.

And yes, I left out handling return values, stack pointers and reference types. We do not need to worry about those for this explanation.

To summarise: arguments and local variables live on the stack. They only exist between function entry and exit. In regular functions, that is.

First class functions

Before we go to closures, it is important to know what first class functions are. JavaScript has them, but what defines them? At first you might think nesting functions is a property of first class functions, but it is not. Turbo Pascal supports inner functions, yet it doesn’t have support for first class functions.

When you can assign a function to a variable, the language supports functions as first class citizens. It should be no different than assigning an integer or a string to a variable.

And when you can assign a function to a variable, you also can pass them around to other functions. Or let functions return functions or assign functions to data structures. This also seems reasonable, but Turbo Pascal can’t do this. For that one needs something extra.
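This post's examples are in JavaScript, but the same properties are easy to demonstrate in Python, which also has first class functions. A minimal sketch (the names add, apply and ops are just illustrative):

```python
def add(a, b):
    return a + b

# A function can be assigned to a variable, just like an integer or a string...
operation = add
print(operation(1, 2))    # 3

# ...passed as an argument to another function...
def apply(f, x, y):
    return f(x, y)
print(apply(add, 3, 4))   # 7

# ...and returned from functions or stored in data structures.
ops = {'sum': add}
print(ops['sum'](5, 6))   # 11
```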

Functional closures

So let us explore functional closures. By the way: closure, lexical closure or function closure; all the same thing. Take the following example:

function test_two(a, b) {
	var c = a + b;
	return function inner() {
		console.info('The sum of %s + %s is %s', a, b, c);
	};
}
var result = test_two(1, 2);

The function test_two returns a nested function, which is assigned to the variable result. So you would expect to execute the inner function by running result(). And this indeed works in JavaScript.

But, when you think of the stack-based system of regular functions, as described above, something is off. The stack algorithm says that when the function test_two returns, the arguments a and b and the variable c are popped from the stack and are not accessible anymore. A call to the inner function would refer to variables that don't exist anymore!

Functional closures solve this. I like to think of a closure as a function with a shopping bag that contains all the variables (state) necessary to perform its function. So the variable result contains a reference to the function inner, together with a bag of values for a, b and c.

If you were to run var result2 = test_two(3, 4), the variable result2 would not contain the same thing as result. The reference to the function is the same, but each closure has its own bag of state.

How exactly this is implemented is not important for my explanation. Different languages do implement this behaviour differently, but the principle is the same. I found some more technical info about JavaScript closures on MDN, and also a blog about C# closures.

The Wikipedia article on closures has the line “Operationally, a closure is a record storing a function together with an environment.” in the introduction. That summarises the point of my explanation pretty well. I hope you now have a better understanding of closures. Let me know what you think.

Building Another Blog Engine

Permalink - Posted on 2021-01-30 23:00

I have some ideas for a blog engine I want to explore. Instead of creating a new website, I’m planning to make incremental changes/improvements to this website. Baby steps. That’s why I’ve named this project BABE: Building Another Blog Engine.

At the moment I’m using GitHub Pages for this blog. When I push a commit, GitHub schedules a task of transforming the source via Jekyll. And the resulting files live somewhere on a webserver. It’s convenient, but it doesn’t give me the control I need.

Step one is to build the website locally, and push the resulting files to GitHub. I don't want to create another repository for the build result, as this adds confusion. Fortunately, git has a nice solution for this: create a branch for Jekyll's build result, and tell GitHub Pages to serve the website from there.

I’m using the branch feature in an unusual way: it is not intended for merging into the main branch. And this way, the published source doesn’t show up in my blog’s history.

This branch should contain no files other than the build result. The obvious choice to do this would be to create a branch and remove all files. But why not use the powers of git and create a branch from the initial commit, way in the past!

initial_hash=$(git rev-list HEAD | tail -n 1)
git branch web-stek $initial_hash

We now have an (almost) empty branch. How do we get the output from Jekyll’s build command to this branch?

One way might be by having the ignored folder _site (which contains the Jekyll build result) on the master branch. Somehow bring the changes to the web-stek branch by switching branches. That would work with new files, but I don’t know how to transfer modified files this way.

Two alternatives come to mind. Either use two git repositories on the same machine in different folders. Or use an intermediate folder for Jekyll’s build result. For simplicity and less confusion, I’ve chosen to work with an intermediate folder.

In the example below, rsync transfers the build result to the /docs folder instead of /_site. I’ve changed the name, because /_site is not listed with the settings for the publication source on GitHub Pages. So /docs it is. The build process works like this:

jekyll build --destination "$jekyll_build"
git checkout web-stek
rsync -av "$jekyll_build/" ./docs/

To finish up, I need to add the folders .jekyll-cache and .sass-cache to the .gitignore file on the web-stek branch. Also, some empty folders might be left behind, because git only tracks actual files. Finally, you might want to remove the files from the initial commit in the branch.

I compared the result in /docs with GitHub's copy by fetching the pages via wget -r https://blog.zanstra.com. I only found one major difference: GitHub Pages normalizes time-zone offsets to zero. That might be an issue for RSS readers, but I can live with it.

So after pushing all this and setting up the publication source, you are good to go. The next step is to automate publishing, but that has to wait for another time.

UPDATED 2021-04-06: A bit mind-bending, but the publication source is considered to be in Jekyll format. How did I learn this? I wanted the folder .well-known to be copied to my website, but Jekyll ignored it; hidden files are on the ignore list. Luckily there is an easy fix: add a file at the root of your site called .nojekyll. The file can be empty.

Moving away from Server.app to Jamf Pro

Permalink - Posted on 2021-01-10 23:00

At Archipunt we manage around 100 macOS computers. We used to manage them via Server.app. For several reasons, we are now moving to an all-MDM solution with Jamf Pro.

To migrate the computers, we identified the following things to be fixed on these computers:

  • The registration with Server.app's Network Account server needs to be removed
  • After this, some cleanup needs to be done in the computer's local directory
  • Since password authentication was done via Server.app, the mobile accounts need to be migrated to local accounts
  • Some computers used Server.app's MDM solution. Where still present, old profiles need to be removed
  • Most computers are registered via DEP, so enrolment needs to be renewed in order for them to be managed via Jamf Pro.

Because of COVID-19, most computers need to be migrated by the users themselves at home. Luckily, all users have elevated privileges. Otherwise, this migration would take a lot more time on our side.

The migration will be done by a custom application. The GUI part is written in AppleScript because it is so well suited for this. The actual migration is done via a shell script. I couldn't have written this without this script by Rich Trouton: I made it work with Open Directory (which Server.app uses) instead of Active Directory. Another post that helped enormously was this check and fix for an expired Apple Push Service Delivery certificate.

I designed the shell-script via a "check-fix" strategy: I first check if something needs to be fixed, and if so, fix it. For example, first I check if the machine is linked to a network account server. If so, the link is disconnected. This way, the script can be run as often as one wants without breaking stuff. Most items above are coded as a check/fix pair. An added advantage is that the separate parts can easily be tested.

It's also handy in case the machine has to be rebooted. After that, just run the script again, and it effectively picks up where it was before the computer was rebooted.

The AppleScript front-end handles the start screen and the request for elevated privileges. When an error occurs within the shell script, the user is advised about this and the log file is shown in the Finder. I reserved exit code 199, via which the shell script can tell AppleScript that the computer needs to be rebooted.

To test the AppleScript without running the actual migration shell-script, I used the user defaults system. By writing a key/value pair to the application bundle domain (defaults write nl.archipunt.public.go-jamf-pro test -string "PASS") the test-application’s behaviour can be influenced. When the key is not present, the actual migration code will be used.

For source-code, please check out the repository on bitbucket.

About the rollout: we first asked a small group of users to test the script, so we could detect common problems early and fix them. This worked very well. After the errors from this initial group were fixed, most remaining errors were plain user errors and could be handled over the phone very quickly.

The hardest error to fix was a user who just didn't get the enrolment notification. Turned out, the user had accidentally switched on Do Not Disturb. And also had no idea that option existed…

Pretty Print JSON with a Twist

Permalink - Posted on 2020-10-04 22:00

TL;DR: you can pretty print JSON in a different manner. Try it here.

My second programming language was Turbo Pascal (the first was GW-BASIC). From a BBS pal I learned a notation in Pascal that puts the semicolon at the start of the line, instead of at the end, as most people do. Sometimes I still apply the same formatting to JSON.

In this blog, I want to explore how such formatting can be done using Python and JavaScript. Just for the record: Pascal uses semicolons as a statement separator, whereas languages like C use them as a statement terminator. JSON's commas share this separator behaviour with Pascal.

To make clear what I want, here’s an example JSON file in the desired layout:

{ "type": "pedant"
, "enabled": true
, "tags":
  [ "nag"
  , "prick"
  , "pin"
  ]
, "data":
  { "test": 123
  , "cooperate": null
  }
}

At first it looks pretty weird. However, there is an obvious advantage. You can visually follow the nesting-level of the data-structure in any editor. And you never place a comma too many.

But why would one with a sane mind want to do this? Well, for starters because it is interesting. And also fight the orthodoxy. And it’s an ideal project to learn something new along the way.

Anyways. Python’s json module provides a dump-method to convert a native data-structure to JSON:

import json
data = {'type': 'pedant', 'enabled': True, 'tags': ['nag', 'prick', 'pin'], 'data': {'test': 123, 'cooperate': None}}
text = json.dumps(data, indent=2)
print(text)

When you run this, it will print out the formatted JSON below.

{
  "type": "pedant",
  "enabled": true,
  "tags": [
    "nag",
    "prick",
    "pin"
  ],
  "data": {
    "test": 123,
    "cooperate": null
  }
}

To better understand what’s happening, I visualized this formatting by adding colored borders.

[The same JSON as above, with colored borders: blue around the indentation, red around the newlines, and green around the separators.]

Blue: providing the indent argument kicks off the pretty printing. A positive integer indents that many spaces per level. If you provide a string, that value is used per indent level. In this example I specified two spaces.

Red: if indent is specified, a newline is inserted just before the indentation.

Green: you can also change the separator behaviour. From the documentation: "If specified, separators should be an (item_separator, key_separator) tuple". So by providing the tuple (';', ': ') for separators, one could create European JSON. Here I provide a semicolon instead of a comma, which is obviously not valid JSON. So just be aware: you can produce invalid JSON with this option. Note that when indent is specified, the default separators become (',', ': '), which avoids trailing whitespace after the commas at line ends.
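A quick demonstration of both arguments (the toy dict data below is mine, not from the post):

```python
import json

data = {'a': 1, 'b': 2}

# With indent, the default separators are (',', ': '),
# so there is no trailing space after a comma at the end of a line.
print(json.dumps(data, indent=2))

# "European JSON": a semicolon as item separator -- not valid JSON!
print(json.dumps(data, separators=('; ', ': ')))   # {"a": 1; "b": 2}
```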

So, these arguments were a bit of a disappointment. It didn’t seem possible to format the JSON the way I wanted it.

However, the API provides another option. We could specify a custom JSONEncoder subclass via the cls argument (you can inspect the code I linked to). This class is heavily optimized and caters for all sorts of requirements. I made a copy of the code, and made some modifications. It just seemed like a lot of work. The code is not really designed for reuse (for example: encoding value types, like boolean, are duplicated multiple times), so I decided to take a different route to a solution at this time.

Look at the color-coded JSON above. What if we replaced comma newline indent by newline indent comma? Yes, I'm talking string replacement. Any newline in the JSON output can safely be treated as whitespace, since a newline inside a string is always encoded as \n (two characters: a backslash and an n).
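That claim is easy to verify with the standard json module: a newline inside a string value never survives as a literal newline in the encoded output:

```python
import json

data = {'text': 'line one\nline two'}
encoded = json.dumps(data, indent=2)

# The only literal newlines in the output come from pretty printing;
# the newline inside the string value is escaped to backslash + n.
print(encoded)
```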

The only tweak we need is that there should be a little bit of indentation after the comma (group \2 in the code below). The remaining indentation should be placed before the comma (group \1). For now, I assume the indent to be two spaces. The ((?:  )*) construct treats the two spaces as one "thing", so it can be matched multiple times; the ?: makes sure it is not remembered as a match group. I do want to match multiples of two spaces here, hence the double parentheses.

import re
def my_json_pretty_print(json):
    return re.sub(r', *\n((?:  )*)(  )', r'\n\1,\2', json)

As a side note: since this is a learning exercise, I tried to make the regular expression more readable by using verbose regular expressions. The idea is that whitespace (when not used in special ways) is ignored, so you can use multi-line strings, and you can add comments to parts of the regular expression. However, since I intend to match whitespace here, I couldn't get it working. And because I also didn't think the regular expression was more readable this way, I abandoned it.

If we combine the two code-snippets from above, we get the following output printed:

{
  "type": "pedant"
,  "enabled": true
,  "tags": [
    "nag"
  ,  "prick"
  ,  "pin"
  ]
,  "data": {
    "test": 123
  ,  "cooperate": null
  }
}

Not bad. Not bad at all! When we compare this to the desired layout from above, we can note the following:

  • There are two spaces between the comma and the object-properties, instead of one. I think this is not wrong, but one space would be nicer.
  • The opening object- and list-characters ({ and [) are followed by a newline. This needs to be addressed.
  • The closing characters (} and ]) seem to be at the right place, so we are fine there.

The first thing I want to fix is to support all sorts of indentation, not only the hard-coded two spaces. I figured it is best to define a new function that calls json.dumps, so indent only needs to be passed once. As for the extra space in the indentation: this is caused by the comma also functioning as an indentation character. Basically, I need to delete the first character of the last indent.

I played a bit with parameterizing the indentation into the regular expression. String concatenation looked terrible. Formatting with {} and .format() wasn't a good match either, because curly braces are also special characters in regular expressions. printf-style formatting (%s) turned out to be a good alternative that keeps the regular expression readable.

While playing with it, I discovered that the handling of the [ and { characters is the same as that of the , character, so I generalized the regex by changing , to a capturing group that matches all three characters, and added a back-reference to the replacement. The code now looks like this:

def json_stringify(obj, indent=2):
    if not isinstance(indent, str): indent = ' ' * indent
    result = json.dumps(obj, indent=indent)
    rx_indent = r'([,{[]) *\n((?:%s)*)%s(%s)' % (indent, indent[0], indent[1:])
    result = re.sub(rx_indent, r'\n\2\1\3', result)
    # Special case: remove inserted newline with top-level array or object
    return result[1:] if result[0]=='\n' else result

So, if we run the code now, we get this:

{ "type": "pedant"
, "enabled": true
, "tags": 
  [ "nag"
  , "prick"
  , "pin"
  ]
, "data": 
  { "test": 123
  , "cooperate": null
  }
}

This is exactly how we want it. As always, there are some things to be desired:

  • Indentations with TAB-characters are not handled well. The code should only remove the first character of the last indent when it’s not a TAB-character.
  • Using regex special characters as indentation (like *) make the program fail: the indent-string should be converted to a valid regular expression string first.
  • The code fails when indent=0 with an IndexError: string index out of range on the code indent[0]. This is easily fixable by using slices, like indent[:1].
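The last point is easy to check in isolation: slicing an empty string is forgiving where indexing is not:

```python
indent = ' ' * 0  # indent=0 yields an empty indent string

# Indexing an empty string raises an IndexError...
try:
    first = indent[0]
except IndexError:
    first = None

# ...while slicing just yields another empty string.
print(repr(indent[:1]))  # ''
```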

So that brings us to the following code:

def json_stringify(obj, indent=2):
    if not isinstance(indent, str):
        indent = ' ' * indent
    result = json.dumps(obj, indent=indent)
    if indent[:1] == '\t':
        r_indent0 = ''
        r_indent1N = re.escape(indent)
    else:
        r_indent0 = re.escape(indent[:1])
        r_indent1N = re.escape(indent[1:])
    rx_indent = r'([,{[]) *\n((?:%s)*)%s(%s)' % (re.escape(indent), r_indent0, r_indent1N)
    result = re.sub(rx_indent, r'\n\2\1\3', result)
    return result[1:] if result[:1] == '\n' else result

So, that’s it. I made this code into a shell script, so it can be run from the command line (don’t forget to chmod +x it before you start it). So how does it perform?

$ ls -lh
-rw-r--r--   1 doekman  staff   4,9M 22 sep 17:08 big_trello_export.json
-rwxr-xr-x   1 doekman  staff   705B 22 sep 17:10 json-pp.py
$ time ./json-pp.py < big_trello_export.json > big_test.json
real	0m0.849s
user	0m0.734s
sys	0m0.069s
$ ls -lh
-rw-r--r--  1 doekman  staff   7,3M 22 sep 17:11 big_test.json
-rw-r--r--  1 doekman  staff   4,9M 22 sep 17:08 big_trello_export.json
-rwxr-xr-x  1 doekman  staff   705B 22 sep 17:10 json-pp.py

Not bad. Five megabytes within a second on my 2017 iMac. I expected worse. But to state the obvious: don't use this in production.

As I mentioned at the start: I couldn’t have ended this quest without back-porting this code to the origins of JSON. The advantage of string substitution: it can easily be converted to JavaScript.

Converting the code to JavaScript was pretty straight-forward. Things worth mentioning:

  • JavaScript now comes with built-in JSON methods (grandpa speaking)
  • I couldn't find a native equivalent of Python's re.escape for JavaScript, so I used Dean Edwards' rescape from the (now ancient) base2 library. Substitutions in JavaScript use $1 instead of \1 in Python
  • To generate 4 spaces in Python you would write 4 * ' '. I couldn't find the JavaScript equivalent at first, so I used new Array(1 + 4).join(' '), but then I found out you can use ' '.repeat(4) in modern browsers
  • Template Literals are still a bit too new to use everywhere, so I implemented the regular expression with string concatenation
  • Python slices [1:2] are calls to .substring(1, 2) in JavaScript. Why can't I remember this?

You can try the JavaScript version here. Thanks for reading!

Query Trello data with Postgres (JSON)

Permalink - Posted on 2020-04-02 22:00

Recently, I was working on a project to automatically create Trello-cards via their API. To do that, I needed some id-values from the Trello-board which are not shown in the user-interface. But where does one get these values?

Glad you asked. Every Trello-board can be exported to JSON, and the values I was looking for are available in this export. However, JSON gives you the data in one big piece of text. Wouldn’t it be handier to have it available as tabular data in a database?

So I created this git-repository to do just that (you will need the Postgres-database for this).

The table in which the JSON is stored, is modelled after an idea of Rob Conery which basically is: store the JSON in a column, along with another column that uniquely identifies the document. This unique value is also available within the JSON itself.

For example:

id, json
12, '{"id":12, "other":"data"}'
13, '{"id":13, "other":"info"}'

The actual table definition is different, but you get the idea. And don’t worry about storing the id-value twice: this small abstraction will make your life a bit easier.

On top of this document table, I created views for the most used objects, like cards and lists. The simple idea of "views on top of JSON" is pretty powerful. You get a lot of bang for your buck!

If you’re on a Mac, Postgres.app is the easiest way to setup a database, but Homebrew works fine too. As client, I use Postico which I really like a lot.

So if you have a Postgres database set up, and have cloned the git project, it's easy to create the database objects via the terminal by typing:

make trello

This will create the trello schema and, within it, the document table into which the JSON will be loaded, plus a bunch of views. I've included a Trello export file, so you can load it by running the following command:

tool/loaddoc.sh data/simple_board_v1.json

loaddoc inserts the JSON document into the database, unless it already exists; in that case it updates the row. To aid this upsert behaviour, I've added a trigger to the database schema that updates the id column. Ideally, this would be handled by the tool itself; however, this is a minimal implementation. I couldn't find an existing tool to load JSON into the database from the command line.

A note about the id-column. Trello uses hexadecimal identifiers of 24 positions. That’s easy for computers, but a little harder for humans. So to ease filtering the views, I’ve added an integer-column with auto-numbering.

Below is a screenshot showing the result in Postico with this Trello board loaded (the same one we just loaded). On the left you see the table and views. At the top I've applied a filter, so I only see the lists from the first JSON document. All columns are type-cast. When you hover over the image, the right pane is shown, which also shows type information.

[Screenshot of the Postico application: the trello_list view with a filter applied (mouseover to display the right pane).]

Since I talked a lot about tooling: when you run cat .ok on the command line, you will find some often used commands on your screen. It’s called an ok-profile. The ok-bash-tool will make it easier to work with this. It also makes you smarter and more efficient.

So if you develop for Trello, you should really give this a try. And if you are working with JSON documents from other origins, I’m very much interested in your ideas.

Originally published at this gist on GitHub.

ok-bash and Python on macOS

Permalink - Posted on 2020-02-16 23:00

Development on Python 2 has stopped, and the default availability of Python on macOS will be deprecated somewhere in the near future. Last week, Scripting OS X wrote about how to deal with this from an admin perspective.

Here I would like to explain how we deal with this from the perspective of a small but very handy tool called ok-bash. This nifty tool helps you free brainspace by creating .ok-folder profiles for bash. You should check it out: it really makes you smarter and more efficient.

Since the default availability of Python on macOS is threatened, this article is written from a macOS point of view. But the tool itself works on all systems that run bash.

The tool is exposed via a bash-function, with a Python-backend helping with more complex tasks like syntax highlighting and other formatting. We need a bash-frontend, so the tool can work in the current shell environment.

We want to keep installation of ok-bash simple, so users can just clone the git-repository and initialize it via the shell startup file.

Since there are so many ways to manage Python, we decided to resolve the Python binary from the current PATH. This is determined via the command which python3 || which python, so Python 3 is used when available (the Python backend works in both versions 2 and 3). Also, when only Python 2 is available (or if Python 3 is symlinked to python), ok-bash still works.
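Purely as an illustration of that lookup order (ok-bash actually does this in bash, not Python), the same resolution can be sketched with Python's shutil.which:

```python
import shutil

# Mirror `which python3 || which python`: prefer python3 on the PATH,
# and fall back to plain python when python3 is not found.
python_bin = shutil.which('python3') or shutil.which('python')
print(python_bin)
```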

It’s also possible to manually override the path to Python, by setting an environment variable.

In the future, ensuring Python’s availability can be done via a homebrew recipe. An extra advantage of homebrew is it supports the use of virtual environments.

To the Point

Permalink - Posted on 2019-10-01 22:00

Already two internet aeons ago, secretGeek came up with some handy bash-aliases to ease navigating the folder hierarchy:

alias ..='cd ..'
alias ...='cd ../..'
#and so on

Two dots plus an enter take you one directory down¹, three dots do that twice, and so on. However, a single dot doesn't go anywhere and rudely gives you an error message. No help there:

$ .
-bash: .: filename argument required
.: usage: . filename [arguments]

Turns out the . is a shell builtin command that requires at least a filename as an argument. How rude.

Wouldn’t it be handy and appropriate to print the current path when no argument is given? In that case, you could try adding the following to your .profile or equivalent startup file:

function . { if [[ $# -eq 0 ]]; then pwd; else source "$@"; fi }

Now a single dot will be more polite and will show the current path. Used with arguments, it works like before. So you still can use it to source scripts, like ok-bash.

¹) I've been told a hierarchy of folders/directories is a tree structure. When navigating into a folder, like cd some_folder, to my earnest knowledge one moves up. Branches mostly grow to the light where the sun is, which is up in the sky. So if you go back from said folder with cd .. one would move down. So secretGeek's aliases better be called down_down_down but maybe that has a less engaging ring to it. Also: the base of the hierarchy —one could go there with cd /— is called the root, which is a kind of odd, because the root is as branched as the part of the tree that grows in the air. And at the file system, it's only one place! Anyways, the proposed function neither goes up nor down, so at least we have no confusion there.

NetNewsWire 5.0 released

Permalink - Posted on 2019-08-25 22:00

From the Coding Guidelines of NetNewsWire:

Version Control
Every commit message should begin with a present-tense verb


Initial Add Add Add Add Remove Add Add Add Add Add Add Add Add Fix Make Add Unbreak Add Add Add Add Add Add Add Add Add Add Add Delete Add Remove Drop Add Add Add Add Add Add Comment Allow Put Make Update Make Show Wire Avoid Replace Add Add Ignore Add Write Update Remove Merge Generate Merge I Merge.

Provide If Add Merge Fix Merge Fix Merge Merge Update Merge Use Use Start Create Start Extract Start Make Make Get Start Add Turn Add Add Move Fix Add Add.

Commit Continue Move Add Work Continue Continue Delete Convert Convert Convert Convert Convert Remove Converrt Add Replace Set Begin Move Remove Set Continue Add Back Make Start Make Make I Rename Give Start Continue Create Add Make Do.

Get Make Get Attach Progress Make More Continue Make LookupTable. Using Yet More Merge Remove Start Checking Added Start Progress Make Finish Make Turn Make Decide Progress Make Fix Make Create Make Delete Progress. Implement Start.

Implement Create Make Make Make Make Continue Continue Make Continue Deal Get Make Use Call Add Delete Fix Make Marked Use Make Make Update Fix Make Rename Remove Add Prune Switch Save Update Move Merge Fix Update Merge Remove Update Move Update Fix Create Slightly Cache Use Delete Implement Update Update Make Update Move Create Delete Fix Get Move Delete Replace Continue Fix Continue Continue Continue Continue Continue Make Get Make Continue Continue Start Use Create Switch Simplify Update Start Remove Start Rename Do Move Make Fix Start Start Make Init Make Fix Make Init Fix Perform Edit Add.

Make Fix Fix Fix Fix Fix Fix Create Move Remove Make Create Make Rewrite Use Fix Fix Remove Require Add Use Use Send Move Update Keep Update Create Make Call Save Fix Avoid Avoid Set Set Simplify Save Comment-out Add Remove Update Update Include Fix Update Refactor Avoid Handle Handle Rename Update Use Show Make Add Implement Show Update Make Fix Create Update Update Define Handle Make Update Differentiate Add Use Removed Update Change Make Make Post Rejigger Send Rewrite Use Update Move Make Fix Update Maintain Update Update Start Create Switch Begin Clear Rename Update Move Fix Edit Put.

Flesh Craete Fix Create Start Do Start Fix Move Move Read Show Use Use Make Make Make Upgrade Upgrade Upgrade Upgrade Fix Make Switch Make Add Display When Mark Use Add Add Add Add Create Write Allow Write Check Add Remove Calculate Draw Close Start Show Draw Create Add Show Change Add Add Merge Rename Add Post Make Make Run Move Update Add Show Add Remove Add Skip Add Bump Add Add Change Add Change Bump Use Lighten Switch Set Decode Add Expand Set Bump Create Fix Use Create Add Add Add Add Fix Create Treat Update Make Use Add Delete Create Move Implement Remove Bump Switch Disallow Save Start Save Save Create Create Make Create Make Make Bump Make Update Set Darken Create Create Switch Create Make Make Replace Fix Make Create Make Use Fix Implement Use Update Show Return Add Bump Update Check Add Fix Fix Put Note Edit Avoid Log Normalize Print Log Simplify Log Add Start Adjust Pull Add Use Rebuild Add Fix Keep Add Make Parse Get Add Start Add Add Prefer Bump Add Add Add Fix Do Don’t Add Hide Add.

Avoid Update Update Bump Fix Save Use Make Start Add Fix Get Add Add Use Bind When Create Delete Fix Do Increase Fetch Set Show Create Refresh Make Use Update Merge Update Add Start Add Fix Continue Make Define Make Create Make Create Create Parse Fix Base Cache Update Clean Fix Remove Remove Get Increase Add Create Create Remove Remove Remove Add Add Add Make Use Add Draw Skip Add Delete Reuse Parse Remove Bump Update Merge Add Save Refresh Parse Make Add Prefer Test Add Support Add Parse Add Add Make Add Add Fix Add Add Add Add Make Implement Add Style Display Normalized Continue Progress Make Edit Implement Position Bump Update Add Add Update Fix Add Add Link Add Add Add Fix Add Add Fix Add Continue Add Fix Update Use Merge Set Add Merge Update Update Move Implement Merge Clean Move Validate Add Use Use Remove Merge Simplify Return Validate Popup Add Remove Create Improve Set Start Don’t Set Use Use Bump Remove Update Implement Go Add Use Make Bump Update Remove Remove Create Merge Make Continue Continue Show Update Add Update Fix Show Use Put Add Fix Use Update Add Add Show Make Fix Comment Bump Update Freeze Fix Make Show.


Move Move Add Show Fix Check Use Update Make Make Turn Create Remove Draw Add move Merge Merge, Bump Increase Make Start Check Remove Merge Make Decrease Make Update Continue basic Merge revert Merge Make Add Use Add Switch Make User Use Fix Increase Add Bump Change Update Update Add Add Create Add Make Use Use Bump Update Make Unbreak Create Scripting Merge Add revert Merge Add Remove Layout Create Create Make Continue Keep Adjust Remove Reopen Open Restore Hide Display Continue Increase Continue Set Share Create Send Extend Make Add Revise Refer Move Add Handle Use Update Merge Scripting revert Add Merge Remove Download Switch Remove Remove Back Use Remove Add Add Set Make Add Add Add Add Validate Set Merge Consider Register Merge Re-sort Create Build Add Add Create Change It Accept Remove Sender Merge Merge Merge.

Merge convert Merge Merge Start Merge Create Layout Change Add Wire Try Disable Make Delete Remake Make Remove Continue Get Add Add Move Make Remove Bump Update Reorder Remove Create Merge Merge Remove Show Make Show Change Update Restore implement Merge Merge Start Add Merge Add Pass Rearrange Add Use Add Create Add Make Add Add Remove Fetch support Merge Merge Fetch Rename Bump Update Remove Skip Make Remember Fix Make Skip Disallow Punt Make Create Implement Support Add Start Make Support Make Remove Update Bump Update When Skip Make Add Do Merge Hide When Make Update Make Create Delete Remove Add Add Use Use Use Remove Remove Use Create Use Create Use Don’t Use Add Remove Create Move Remove Move Move Fix Remove Move Darken Do Start Show Skip Add Mark Release Make Remove Rewrite Define Tweak Bump Update Give Tweak Draw Show Make Update Merge Switch Remove Create Add Turn Reenable Make Add Use Make Don’t Don’t Remove Replace Make support Remove Start Continue Use Fix Turn Reduce Lower Lighten Tweak Add Skip Make Turn Fix Match Adjust Use Make Turn Revert Hide Fix Make Make Deal Rebuild Bump Update Force Draw Reduce Save Bump Update Start Clean.

Support Merge revert Reset Merge Merge.

Add Merge add Merge better.

Merge Add Merge.

Add Remove Break Break Break Break Break First Merge use Break Break Rename Rename Add Add Add Add Build Update.

Add Get Get Make Add Use Nuke Fix Remove Remove Remove Use Update Merge Really Use Use Make Submodule Remove Submodule Start RSDatabase Remove Rename Move Delete Remove Include.

Update Add Unbreak Merge Oops Merge Add I Change Merge Fix Changes Merge Simplify Update Fix Use Unignore Use Use Use Use Update Use Update Use Use Use Use Use Use Use Use Use Update Update Update Add Update Update Use Update Merge Finish Use 🎉Start Continue🏡 Update Update Revise Merge.

Removed Rename Update Change Use Use Merge Switch Merge Merge Hide Allow Normalize Normalize Fix Make Update Stop Merge Don’t Rename Add Update Add Update Start Keep Use Merge Merge Manually Add Bump Bump Hide Add Add Fixed Fix Merge Merge Don’t Merge Removed Fixed Merge Merge Update Update Change Changed fixed Merge Added Merge Remove Update Bump Update Enabled Merge Enabled Merge Bump Fix Print Fix Give Update Implemented Start Update Rolledback Enabled Changed Enabled Stop Update Update Stop Give Store Stop Merge Don’t Write Merge Merge Changed Added Merge Merge Save Merge Store Store Make Store Store Store Store Skip Write Added On Remove Stop Delete Remove Made Update Skip Merge Modified Updated Disallow Include Register Create Made Made Remove Start Make Refactor Merge Normalize Register Validate Add Accept Make Get Validate Make Added Added Sorted Redo Start Improve Merge Merge Merge 5.0d7. Update Temporarily Fix Include Merge.

Update Continue Complete Fix Similar Merge Merge.

Remove Update Re-enable Make Allow Update Make Update Increment Install Clean Update Update Update.

Clean Update Merge Add Add Make Override Use Enable Add Add Update Update Get Do Disable Update Move Add Remove Add Remove Start Continue Make Remove Add Remove Add Use Rename Remove Make Move Fix Remove Remove Remove Make Add Add Add Add Make Make Add Remove.


changed Merge added Merge Start Merge Send Add Fix Update Run Don’t Start Add Add Add Update Update Add Fix Update Bind Add Switch Keep Update Update Fix Set Bump Close Add Bind Refresh Check Add Fix Switch Update Remove Enable Update Make Bump Tighten Reduce Further Update Bump.

Moving Remove Updates Update Re-sort Restore Make Set Start Update Refresh Bump Drop Update Use Make Add Implement Add Implement Bump added Merge Don’t Don’t Update Make Allow Queue Do Remove Remove Remove Skip Fix Fix Discard Bump Update Skip Stop Update Remove Remove Remove Remove Remove Fix Merge Move Merge Remove Get Delete Delete Simplify Simplify Simplify Rationalize Import Start Make Update Quiet Merge Create Update Remove Move Make Remove Make Place Move Create Add Start Start Differentiate Merge Change Fix Merge Work Merge Make Continue Start Continue Remove Make Make Continue Rewire Fix Merge. Update Make Remove Remove Make Update Make Update Update Implement Make Add Index Update Revise Create Show Add Update Remove Revert Bump.

Implemented added Use Remove Get Remove Remove Create Made Recalculate Merge Merge Remove Comment-out Rolled Merge Updated Merge Switch Merge Made Merge updated Merge made Merge add Merge Change Document Delete Create Update Continue Add Import Pull Change Remove Start Continue Update Make Make Skip Remove Remove Perform Do Give Pay Start Update Continue Make Make Move Finish Continue Continue Update Merge Store Update Convert Convert Post Convert Convert.

Remove Set Drop Merge Remove Removed Create Merge Updated Remove Remove Remove Remove Merge Made updated Changed Updated Use Remove Remove Remove Remove Remove Remove Update Move Fixed Merge Remove Merge Delete Fix Move Move Remove Remove Major Move set Added updated Remove Work Start Update Save Update Take Made updated Added added Added adding Merge cleared Attempted Design Removed Merge Added Fixed Combined Combined Fixed Implemented Implemented Updated Group Merge Update Update Update Update Merge Updated Established Adjusted Fixed Fixed Changed Cleaned Fixed Added Added Made Added Reduced Fixed Implemented Refactored Made Added Changed Renamed Added Made Implemented Add Remove Add Update Start Create Switch Change Fix Implement Make Fix updated Update Fix Remove Tweak Fixed Change Revise Merge Removed Restored Fix Change Prevent Create Change Update implement Move Change Refactor Encapsulate Implement Add Finish Hide Move Prevent Fix Removed Hide Change Add Implement Fix Make Make Skip Remove Change Add Rename Make Change Add Refactor Enable Comment Implement Change updated Fix Add Implement Updated Add Add Prevent Add Rework Update Change Remove Add Rollback Enable Add Added Made Remove Change Add Enable Tweak Add Add Change Tweak Hack Provide Delete Rename implement Fix Fix Force Fix Made Deselect Deleted Align Make Respond Fix Align Change Slightly Increase Changed Use Add Merge Update Add Merge Add Add Merge Add Change Merge Change Create Change Fix Merge Refer Implement Make Made Merge Set Add Fix Add Fix Reuse Merge Write Add Make Merge.

Rename Add Add Make Rename Rename Add Load Enable Delete Refine Make Make Prevent Implement Make Make Add Implement Implement Tweak Expand Prevent Show Update Updated validate Make Update Made change Add Add refactor Upgraded Pass Modify Rename Add Add Update Fix update make Add Added Fix Add Change Add Save added Add Keep Add Add Add added Update Add Fix Add Update Add Change Use Add Capture Refactor Add Restrict Move add store make Make Place Make Added Change Delete Set Change Add Put Stub Make Speed Use Set Update Merge Add Save Add Fix Delegate Make Save Change Scale Add Add Fix Upgraded Fix Fix Removed Fix Restore Update Add Start Update Add Add Add Rename Add Align Switch Prevent Retrieve Add Fix Scale Merge Update Show Fixed Disable Merge Move Make Remove Remove Add Disable Merge Remove Rename Dismiss Don’t Removed Disable Don’t Dismiss Rename Updated Improved Further Made Add Made Updated Set Add Moved Always Made Merge Allow Update Decrement Simplify Increase Merge Add Add Merge Make Make Updated Implement Make Make Remember Fix Update Remember Add Remember Make Update Update Send Fix Enhance Use Change Merge Remove Correct Use Cleaned Left Improved Removed UI Show Centered Merge Made Fixed Use Update Add Update Remove Update Merge Reformat Remove Add Set Merge Update Merge Correct Update Add Fixed Change Add Wrap Update Update Tweak Revise Add Fix Fix Add Tweak Change Make Update Change Merge Allow Enable Updated Updated Merge Refactor Correct Remove Google Use Point Enable Prevent Remove Handle Enforce Update Authentication Update Merge Rework Enable Corrected Remove Correct Refactor Rename Rename Add Ensure Modify Enable Handle Decoding Refactor Update Validate Fix Clear Remove Make Merge Merge Update Update Remove Provide Made Bump Update Fix.

Cleanup Cleanup Add Make Pop Add Prevent Fix Fix Fix Ensure Fix Remove Fix Move Revert Fixed Merge Resolved Fix Merge Fix escape Fix Merge NetNewsWire upgraded Fix Changed Add Update Make Updated Updated Change Request Show add Merge Update Update Unread Mark Convert Make Add Update Add Fix Dismiss Merge Added Merge Updated Revert Add Added Updated Updated Marking moved Add Use fixed Refactor adding trying fix Merge Add Merge adjusted Made Made Refactor Add Correct Add Update center add Add Merge Revert Add Added Merge Fixes Merge Need Updated Add Merge Merge submodules Merge Move Subscribing Rename Port Add Merge Opens Update Update Update Merge Move Add Add Disbale removed Merge Merge FIrst Change use Fixed Wiring Code Begin Update Update Only Update Merge Update Remove Skip Use Merge Merge Change Move Update Implement Cleanup Merge Merge Adds Merge Renamed Merge Updated some Merge Merge Merge Merge Renamed restores restores Restrict Corrects Merge Merge Changed Switches Adds Switches Rebrand Merge Update Ensure Reduce Update Make Support Low Set SF Merge SF Rolls Deletes Merge Implement Merge Retrieve Temporarily Move Force Navigate Correct Fix Fix Removed Move Temporally Add Change Rename Fix.

Move Run Merge Update Add Encapsulate Move Move Move Remove Change Merge Write Update Update Make Update Add Note Remove Update Merge Update Make Merge Use Update Update Update Mark Make Merge Remove Make Delete Make Make Make Make Make Make Merge Update Use Update Merge Create Merge Merge Add Merge Add fiddle Merge Merge Use Merge Create Merge Updated Use Remove Merge Merge Create Merge Create Create Make Merge Create merge Merge Add Prevent Added Add Fix Hook Change Show Fix Initial Fix Merge Make Merge Help Remove Merge Remove Automatically Rename Change Merge Show Merge In Restore Merge Remove Merge Merge Make Merge Remove Trying Get Merge make Merge rename CLean Skip Merge Integrate Check Merge Merge Merge revert Revert Try Exempt Merge Helpbook: Merge Add Merge Prevent Merge fix Fix fix Merge Merge.

Merge Refactor Make Add Remove Implemented Correctly Adjust Correct The Merge Merge Replace Merge Fix Fix Fix Remove Merge Made Keep Fix Wire Merge Fix Merge Enable Add Merge Fix Deleted Made Remove Fix Fix Merge Reformat Merge Refactor Fix Change Reenable Correct Removed Fix Fix Fix Helpbook: Merge Remove Merge Help Merge Change Merge Merge fix undo Bump Merge Add, Merge Merge Update Fix Merge Bump Merge Use Merge Break Merge Update Add Add Get Update Merge Bump Merge Update Bump Merge Now Merge Update Fix Rearrange Add Add Add Add Center Add Add Add Add Change Do Implement Add Merge Add Remove Move Add Merge Add Add Add Add Move Merge Commenting Fix Work Change fix Replace, Merge Start Port Change Optimize Remove Make Give Refetch Bump Merge Update Add Merge Hack Remove Add Fix Reload Center Center Center Bump Update Merge Bump Merge Add Leverage Merge add Change Add Update.

Congratulations to Brent and others. Enjoy NetNewsWire 5.0

Office for OS X icons

Permalink - Posted on 2019-08-16 22:00

Updated: added Big Sur icons.

OS X, now called macOS, was based on the NeXTSTEP operating system. And one innovation of this operating system was large full-color icons.

With the introduction of new icons this year with Office 365 (including the 2019 retail version), I felt the need to create this:

Office for OS X icons (focus or hover to display the 2021 Big Sur update)

In 2001, Office v. X was released. Its design was very distinct from Windows, and also very aqua-y. The maximum size was 256×256 pixels. The 2004 icons I found were oddly sized at 128×128 pixels. The resolution of the 2008 and 2011 icons went up to 512×512 pixels. These were also the last OS X-specific icons for Office.

Starting with the 2016 version, the icon design of the Windows version was also used on macOS. And because of the higher resolutions of displays, the dpi-count of the icons went up from 72dpi to 144dpi, making the 2016 and 2019 icons 1024×1024 pixels.

Haven’t had enough? At the excellent Version Museum they have all Windows icons of Word, including screenshots. They also have a page on Excel.

On renaming a GitHub repository

Permalink - Posted on 2019-06-11 22:00

Did you know you can rename GitHub-repositories? There are some quirks though.

Check it out: https://github.com/doekman/You-are-so-lame
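One quirk worth knowing: GitHub keeps redirecting the old URL after a rename, but it’s still tidier to point existing clones at the new name. A quick sketch in a scratch repository (the URLs are just placeholders):

```shell
# Demo: after a repository is renamed on GitHub, the old remote URL
# keeps redirecting, but updating the remote keeps things explicit.
cd "$(mktemp -d)"
git init -q renamed-demo
cd renamed-demo
git remote add origin https://github.com/doekman/old-name.git
# Point the existing remote at the repository's new name:
git remote set-url origin https://github.com/doekman/You-are-so-lame.git
git remote -v
```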

A third way of putting AppleScript into git

Permalink - Posted on 2019-05-05 22:00

There are currently two ways of putting AppleScript into git. The first is simply adding the binary .scpt-files. You don’t get much benefit from using git, but it works.

The second way is to save your AppleScript source as text-files, and put these into git. You get advantages from git, but it is not as seamless as the first way.

Introducing osagitfilter

Now you can combine those two ways by using the osagitfilter utility. Technically, it’s a git filter that translates AppleScript’s binary format into a text representation, which is what git then stores internally.

See the repository’s readme for installation instructions. After that, you can put compiled script files, AppleScript applications and script bundles in git as if they were regular text-files. When you then clone this repository, the files are re-assembled bit-perfect.

Let me demonstrate this by an example. First create a git-repository:

git init osagitfilter-demo
cd osagitfilter-demo

Now create the AppleScript-file my_script.scpt in this folder with the following contents:

display dialog "What's your name" default answer ""
say "Hi there, " & text returned of result

Since we don’t want to add a binary file to git, we first need to associate the .scpt-extension with osagitfilter (you need to explicitly opt in for every repository). This can be done by adding a line to the .gitattributes-file, connecting the .scpt-extension to the osa filter:

echo "*.scpt filter=osa" >> .gitattributes
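The .gitattributes line only names the filter; the filter itself is a clean/smudge command pair registered with git config. The exact commands are in osagitfilter’s readme — the `osagitfilter clean`/`osagitfilter smudge` strings below are my assumption, not copied from it. Roughly, the registration looks like this (done in a scratch repository so the demo is self-contained; in practice you’d run it once, often with `--global`):

```shell
# Hypothetical clean/smudge registration for a git filter named "osa".
# (Command strings are illustrative; see the osagitfilter readme for the
# real ones.) Using a scratch repo so this demo is self-contained:
cd "$(mktemp -d)" && git init -q .
git config filter.osa.clean  "osagitfilter clean"
git config filter.osa.smudge "osagitfilter smudge"
git config filter.osa.required true
git config filter.osa.clean   # shows the registered clean command
```

The clean command runs when files are staged (binary → text), the smudge command on checkout (text → binary), which is exactly how the bit-perfect round trip works.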

Now the files can be added to git:

git add .gitattributes my_script.scpt
git commit -m Initial

Now let’s change the script by appending & ", I like your name" to the say command in the my_script.scpt file. Don’t forget to save.

Now when you run git diff, you can see the changes you made, just as with text-files. You don’t need to stick with the command-line: I can confirm it works with GitHub Desktop. It should also work with other GUIs, but I haven’t tested this.

Not only AppleScript

The program is called osagitfilter, so you can also add JavaScript .scpt-files. There is also a feature that prevents you from accidentally adding AppleScript-debugger files to git (a special file format used by the indispensable Script Debugger).

I hope osagitfilter will be a useful tool. It will not completely replace the first two methods, but it’s nice to have an alternative to them.

Let me know what you think on twitter or at this thread on Late Night Software’s AppleScript forum.


Permalink - Posted on 2019-03-07 23:00

A while ago, I created a shell script called git-url. It prints the remote origin URL of your current git-repository (or exit 1 if no git-repo is found). If the URL is in git: format, it is converted to http: format.
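A minimal sketch of the URL conversion such a script could do (this is not the author’s actual git-url code; it just illustrates the idea):

```shell
# Turn an SSH/git remote URL into an https one a browser can open.
# (Sketch only; the real git-url script reads the remote from the repo.)
to_http_url() {
  printf '%s\n' "$1" | sed -E \
    -e 's#^git@([^:]+):#https://\1/#' \
    -e 's#^git://#https://#' \
    -e 's#\.git$##'
}

to_http_url "git@github.com:doekman/osagitfilter.git"
# → https://github.com/doekman/osagitfilter
```

In the real script, the input would come from `git config --get remote.origin.url`, exiting with status 1 when that fails.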

Why would one write this? Because of open `git-url`.

This will open the GitHub- or BitBucket-repository in your favorite browser. I use it all the time!

Above works on macOS. Windows uses start instead of open, but I don’t know how it works with Windows Subsystem for Linux.

Updated 2019-12-04: you can now add a SHA1 hash to go directly to that commit in the browser (GitHub and BitBucket), like open `git-url 7593e1a`, which opens this link.

Networking with iOS via Ethernet

Permalink - Posted on 2018-12-30 23:00

This summer I went to Ecolonie, a very nice place in France. It is an idealistic place that doesn’t want to exclude anyone. So they don’t allow digital wireless connections like Wifi and Bluetooth, because this can make some people sick.

So I thought of connecting my iPad to the internet via Ethernet. I’m not the first one to try this, but I wanted to share my experiences. I first bought the Apple Lightning to USB 3 camera adapter (you can get cheaper clones, but non-Apple peripherals can suffer from compatibility issues). Next was a powered USB hub from Aukey, which already has an ethernet LAN adapter built in! So there’s no need to buy an external USB Ethernet Adapter.

This setup works (almost) perfectly with an iPhone 5 (iOS 10.3.3), an iPhone SE and an iPad Air 2 (the latter two on iOS 11.4.1). I didn’t test my original iPad, since I don’t have a camera adapter with a dock connector.

When connecting the setup, you’ll get an entry between Wifi and Bluetooth in the settings menu, where you can check your TCP/IP settings. When I plugged in my Apple USB Ethernet Adapter, the Ethernet menu showed two adapters (one extra for the built-in Aukey ethernet). Like macOS, it prioritizes the adapters automatically and chooses the “best” one. I performed some speed tests, and I didn’t find significant differences between Wifi and Ethernet.

I found one downside which stops it from working perfectly. I switched on Airplane Mode to make sure no digital wireless signals were emitted. However, in that case almost every app you open shows a warning to switch off Airplane Mode. Luckily these warnings can be dismissed without any problems, and bytes will happily flow between your device and the internet.

Other findings: I charged my iPad via a cable between the lightning connector on the Camera Adapter and the fast-charge port on the USB hub. This caused noise on the speaker I had connected to the iPad via the 3½mm audio jack. The noise disappeared when removing the charging cable from the USB hub, and did not return when connecting the iPad to its own charging adapter.

One final note on USB Ethernet adapters: naturally they have built-in MAC addresses. But you can expect some complications on your network when you use MAC addresses to identify all allowed devices. Devices, like in Lisa’s tablet and Bart’s iPhone… D’oh!

Jump around

Permalink - Posted on 2018-09-12 22:00

There are more ways to navigate the file system in bash than I imagined when I was still using cmd.exe. Let me name some:

  • cd navigates to the home folder. Like cd /users/$(whoami) or cd "$HOME" but shorter.
  • cd - navigates to the previous directory. It’s like the zap-button on your television remote control!
  • cd ~/Doc tab-completion works like Visual Studio’s intelli-sense, but without the dropdown, and it’s case-sensitive. If there is one choice, it will complete it. If there are more, you need to press tab twice, and it will show a list.
  • pushd/popd/dirs save/load/show a directory to/from the stack. Especially handy in complex scripts.

However, I noticed I was always navigating to the same handful of folders. I thought, wouldn’t it be handy if one could save/load folders to some kind of dictionary? I did extensive research and couldn’t find anything, so I created go. It was great. Typing go gh performed cd ~/prj/GitHub. Saving the current folder under the name “here” was done with go here . And plain go would show all the currently stored definitions.
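For illustration, a go-style bookmark function could be sketched like this (this is not the author’s actual implementation; the ~/.go_dirs file format and all names are made up):

```shell
# Sketch of a "go" directory-bookmark function. Bookmarks live in a
# plain-text file as "name<TAB>path" lines. (Hypothetical design.)
GO_FILE="${GO_FILE:-$HOME/.go_dirs}"

go() {
  if [ "$#" -eq 0 ]; then
    # "go": show all stored definitions
    cat "$GO_FILE" 2>/dev/null
  elif [ "$#" -eq 2 ]; then
    # "go here .": save directory $2 under name $1
    printf '%s\t%s\n' "$1" "$(cd "$2" && pwd)" >> "$GO_FILE"
  else
    # "go gh": look up the name and cd to the stored directory
    dir=$(awk -F '\t' -v name="$1" '$1 == name { print $2 }' "$GO_FILE" | tail -n 1)
    [ -n "$dir" ] && cd "$dir"
  fi
}
```

Because it has to change the caller’s working directory, this must be a shell function (or be sourced), not a standalone script.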

As I mentioned before, I performed thorough, extensive research. I didn’t take it lightly. However. Apparently, the subject sometimes needs time to sink in. Today I learned from a guy I’ve known for some time that he created a tool called markjump. This is basically what I made, only for PowerShell. But his work was inspired by Jeroen Janssens’ shell script, from five years ago. And he’s a Dutchie too! How could I have missed that? And his code is 1500% better!

To complete my story line: Half a year ago I ran into the z-utility (can’t remember how), which has the description jump around. I was intrigued! I learnt descriptions of utilities are pretty important. And it turns out to be a pretty clever and helpful utility. It automatically remembers where you navigate to; no need to keep a list manually. And it will also keep statistics.

It uses regular expressions to find which folder you want to navigate to. And it’s case-insensitive, so z dow will navigate to ~/Downloads. No need to type a capital-D. When the regular expression matches multiple folders, it uses the statistics (how recent and how frequent) to decide where to navigate to.

I must say I was a bit hesitant to use it, because it seemed so… technical. In the beginning of the man-page it says something like “Tracks your most used directories, based on ‘frecency’..” But the “jump around” reeled me in. I never used my own go-utility, and only just now looked back. Thanks rupa!

Emojis missing from emojis.wiki

Permalink - Posted on 2018-08-27 22:00

The following entries are supposedly missing from emojis.wiki. However, you could also read this article as a proposal for alternate descriptions of a part of the Miscellaneous Technical range of the Basic Multilingual Plane of the Unicode standard.

  • Chin on fist on arm
  • Super happy
  • Moustache concealing mouth
  • Disbelieve
  • Big surprise
  • Unsure
  • Talking frolicsome

If you have any corrections or addendums, please let me know on Twitter.

To vowel case

Permalink - Posted on 2018-07-07 22:00

There already was Sentence case, lower case, UPPER CASE, Capitalized Case, AlTeRnAtInG CaSe and Title Case.

And nOw YOU cAn dO vOwEl cAsE tOO!

function toVowelCase(text) {
  return text.toLowerCase().replace(/[aeiouy]+/g, match => match.toUpperCase());
}

Updated: a correct implementation of toVowelCase is locale-dependent. In English, the letter y is sometimes a vowel, and at other times a consonant. The above works for the Dutch language and some other languages. Your mileage may vary.

Updated: bookmarking https://headlinecapitalization.com and https://titlecaseconverter.com.

To refactor, or not to refactor

Permalink - Posted on 2018-06-28 22:00

The code iteration below is a contemporary comment on the use of refactoring.
We start with this code:

function isLeapYear(year) {
  if (year % 4) {
    return false;
  } else {
    if (year % 100) {
      return true;
    } else {
      if (year % 400)
        return false;
      else return true;
    }
  }
}

All code has to be unit tested, so we write this (normally I start with this before writing actual code, but you know…):

function unitTests() {
  describe("leapYear tests", function() {
    // representative tests (reconstructed); the fourth exercises the 400-year rule
    expect(isLeapYear(1999)).toBe(false);
    expect(isLeapYear(1996)).toBe(true);
    expect(isLeapYear(1900)).toBe(false);
    expect(isLeapYear(2000)).toBe(true);
  });
}

And to stay very agile, I wrote some minimal unit testing “library” (individuals over tools, anyone?):

function describe(testName, tests) {
    var groupTitle = 'Name: '+testName+' - Start: '+(new Date).toISOString();
    console.info(groupTitle);
    expect.nr = 0;
    expect.errors = 0;
    tests();
    if (expect.errors == 0) console.info('All %s tests OK', expect.nr);
    else console.error('%s of %s tests failed', expect.errors, expect.nr);
}

function expect(x) {
    return {
        toBe: function(y) {
            if (x === y) console.info("%s: OK 👍", expect.nr++);
            else console.error("%s: Error 👹", expect.nr++), expect.errors++;
        }
    };
}

The unit tests all passed. Can’t wait to start with refactor step 1: removing unnecessary nesting. One step at a time.

function isLeapYear(year) {
    if (year % 4) {
        return false;
    }
    if (year % 100) {
        return true;
    }
    if (year % 400) {
        return false;
    }
    return true;
}

Yes…, much better! All tests pass. Refactor step 2, fold down into one if:

function isLeapYear(year) {
    if (year % 4 || (!(year % 100) && year % 400)) {
        return false;
    }
    return true;
}

And for the heck of it, remove the unnecessary if as well:

function isLeapYear(year) {
    return !(year % 4 || (!(year % 100) && year % 4000));
}

Oopsy. The fourth test failed. Just a typo, no worries. And since there’s already too much negativity around, I’ll fix that in the next iteration, and while we are at it, move the negation to another function.

function isLeapYear(year) {
    return !isCommonYear(year);
}
function isCommonYear(year) {
    return year % 4 || (!(year % 100) && year % 400);
}

The committee unanimously decided: don’t rely on javascript’s truthiness, we want explicit code! So fixing that:

function isLeapYear(year) {
    return !isCommonYear(year);
}
function isCommonYear(year) {
    return year % 4 != 0 || (!(year % 100 != 0) && year % 400 != 0);
}

Refactor step 6, the overload of negativity was back. Fixing:

function isLeapYear(year) {
    return !isCommonYear(year);
}
function isCommonYear(year) {
    return year % 4 != 0 || (year % 100 == 0 && year % 400 != 0);
}

Refactor step 7 should be a charm. I still don’t like the code at all, so f$*# it, just use the native function:

function isLeapYear(year) {
    return new Date(year, 2 - 1, 29).getDate() == 29;
}


Code can also be found here on GitHub.

Jekyll template for JSON Feed

Permalink - Posted on 2017-05-18 22:00

When I learned about JSON Feed, I immediately liked the idea. It’s like RSS/Atom, with the good parts (it’s decentralized) and without the bad parts (no more XML, well-thought-out standard attributes).

I’m not too sure how it will be used in practice, but it was easy enough to create a Jekyll template (source on GitHub). I learned it’s handy to add a base-tag to the content_html-attribute, after trying this app.

And I made myself a nice square logo too.

JSON, the game #2: The parsings

Permalink - Posted on 2017-05-13 22:00

For some reason, I imagined the next step in the development of the game was to make the ‘block’ move. Like up and down, and sliding to the left. For agile reasons, I want to make progress that is reflected and visible in the software. So no spending time on writing libraries first. You probably end up writing stuff you don’t even need. But when starting, I realized the partial-JSON part needed to be worked on first.

To display the partial-JSON, I need a model of that JSON. That model should be capable of annotating the partial-JSON with grammar issues (like missing bracket/comma, incomplete strings or just plain errors). Indentation should be addressed too. Also some dimension-data is needed, so we can determine whether the block can move up/down and when it has reached the partial-JSON surface (sorry, I can’t stop thinking in planet metaphors).

When pondering these requirements, I realized a tokenizer/parser was necessary. A library! Not very agile, right? Perhaps skipping the grammar issues first and adding them later would have been a wiser choice. We will never know. Writing a parser does take some time, but it sure is fun.

First I started writing a tokenizer. And since it’s not your average code, I used the most annoying coding convention I could think of. I learned this style with Turbo Pascal in my BBS days. Years later I discovered you can use it with languages like JavaScript, CSS, C# and even PL/SQL. But it’s too annoying for everyday use. Not only do you get The Eye from your colleagues; every editor I’ve tried actively works against you by messing up the indentation. Again, and again. With. Every. Single. Edit.

The tokenizer splits the JSON into tokens like a string, boolean value or comma. With the parser we can then reason about tokens instead of characters, which makes life much simpler. Because we work with partial JSON, the tokenizer is made to recognize unfinished strings and other values. Every token has an is_complete-attribute, so when rendering we can show if a brace is missing, or a string is unfinished.
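A much-simplified sketch of such a tokenizer (this is not the game’s actual code; the token shape and names are made up, and escapes and Unicode are ignored, as in the real one for now):

```javascript
// Sketch: split (possibly partial) JSON into tokens, flagging
// unfinished tokens with an is_complete attribute.
function tokenize(json) {
  const tokens = [];
  let i = 0;
  while (i < json.length) {
    const ch = json[i];
    if ('{}[]:,'.includes(ch)) {                 // structural tokens
      tokens.push({ type: 'punct', text: ch, is_complete: true });
      i++;
    } else if (ch === '"') {                     // string, maybe unterminated
      let j = i + 1;
      while (j < json.length && json[j] !== '"') j++;  // (ignores escapes)
      const closed = j < json.length;
      tokens.push({
        type: 'string',
        text: json.slice(i, closed ? j + 1 : j),
        is_complete: closed,
      });
      i = closed ? j + 1 : j;
    } else if (/\s/.test(ch)) {
      i++;                                       // skip whitespace
    } else {                                     // number / true / false / null
      let j = i;
      while (j < json.length && /[^\s{}\[\]:,"]/.test(json[j])) j++;
      const text = json.slice(i, j);
      tokens.push({
        type: 'value',
        text,
        is_complete: /^(-?\d+(\.\d+)?|true|false|null)$/.test(text),
      });
      i = j;
    }
  }
  return tokens;
}

tokenize('{"name": "Doe');
// last token: { type: 'string', text: '"Doe', is_complete: false }
```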


The parser was a dragon to me. I made a lot of diagrams to create a state machine, see above, and played with GraphViz (which is pretty cool, and a little weird). Finally I thought the state machine was right, and started coding. After writing some unit tests and identifying some failures, I realized I had gone in the wrong direction. A state machine is apparently not the best solution for parsing JSON, and the answer was right in front of me. While I was focussing on the nice flow diagrams at json.org (which I coincidentally also knew from the Turbo Pascal manual; I know, grandpa speaking), the program I needed to write was listed as an inset on the right of that same web page! object := {} | { members }… Elementary, dear Watson…

Ah well, adventures in failure. Start all over, and everything will be fine. At the moment, I have a validating parser (source here). It says whether the JSON is correct or not. There are still loose ends, which I will fix when needed. For example, I ignored Unicode for the most part and focussed on ASCII for now.

On a side note: I also wanted to check out those new ECMAScript 2015 features, since every browser seems to be supporting them. It’s nice to see the language is evolving, after multiple periods of stagnation. The best feature IMHO is string interpolation.
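For what it’s worth, string interpolation looks like this (a tiny example of my own, not from the game’s code):

```javascript
// ES2015 template literal: expressions are interpolated inside ${…}.
const tokenName = 'comma';
const message = `Unexpected token: ${tokenName}`;
console.log(message); // Unexpected token: comma
```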

Next step is adding code to the parser for annotating tokens, and insert missing-tokens. And render the thing in HTML naturally.

JSON, the game #1: Bootstrapping it

Permalink - Posted on 2017-04-29 22:00

Around a year ago, I had the daft idea to create some nerdy JSON-game. I had good fun thinking about some concepts, made some notes, and that was about it. Until now that is. The game: It’s coming… It has already started. To keep me focussed, I will keep a journal of the creation, and you’re reading the first part.

I got the idea from series like Brent Simmons’s Vesper Sync Diary and Jon Skeet’s Reimplementing Linq to Objects. I found them enjoyable; a bit of a nerdy soap opera. And now is the time for me to get off my comfy sofa and produce some soap of my own.

The idea of the game: JSON tokens are coming in, and you need to place them correctly. The initial form of the game will be Tetris-like, with your head on your right shoulder: the tokens come in from the right and move to the left (think of it as alternative gravity). However, this form might change from level to level (I’m thinking of the partial JSON document as a starship, crashing into JSON stars/tokens as it cruises through the JSON universe).

The first step is to create some user interface mockup. A guy gotta start somewhere. My weapon of choice is HTML/CSS. At the top we have a dashboard providing some game details. Unformatted JSON will do for now. The playing field (where the partial JSON is) will have a fixed width, while the height might grow during gameplay when tokens are inserted between lines.


Positioning is done with the em and ch CSS units, using a position relative/absolute container-combo so I can position things absolutely within the relative container. Every character is one ch wide and every line is one em high. One little problem though: em is font-height, not line-height, and a line-height of 100% just doesn’t look right. But there is no such thing as a line-height unit. (Or is there?) This is easily fixable with a calculation, but then the line-height value will appear in multiple places in the CSS.
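The container-combo and the line-height calculation might look roughly like this; the class names are made up for illustration.

```css
/* Illustrative sketch of the em/ch positioning scheme; class names are invented. */
.playfield {
  position: relative;        /* absolute children position against this box */
  font-family: monospace;    /* so every character really is one ch wide */
  line-height: 1.2;
}
.playfield .token {
  position: absolute;
  left: 4ch;                 /* column 4: one ch per character */
  top: calc(2 * 1.2em);      /* row 2: no line-height unit exists, so
                                multiply em by the line-height factor */
}
```

The `1.2` factor is the duplication the post complains about: it appears both in `line-height` and in every `calc()`.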

Also some color coding needs to be added, so we can identify unbalanced grouping tokens, missing separators, incomplete tokens and downright errors. That’s why I’m not using a pre-element. I don’t have a definite idea about gameplay, but some ideas come to mind. An obvious consequence of an error would be “Game over”. This can also be postponed by counting “lives”: the error vanishes and you lose a life. Or an error could be cleared with the incoming JSON token \b. That choice will depend on the level. But I digress.

This is not all the markup we need (we will also need a dialog box, for the start and end of a level), but it will do for now. I put the source code on GitHub, and every instalment will point to a commit so you can see how the game takes shape. You can also “run” the game here (nothing to play for now).

I hope to find a good balance between enjoyability, a bit of fun and some technical remarks. And I’d like to hear your thoughts on Twitter.

The next instalment will be about moving those incoming tokens.

Only my warnings

Permalink - Posted on 2016-09-07 22:00

Don’t you hate to be put in a situation like the following?

You are assigned by your company to add some functionality to some software at some client. The client has standards and practices in place stating you can’t check in code that produces warnings, only to find out that a clean checkout produces loads of warnings when building!

But you can’t fix 😢 those warnings. You, the super star programmer™ (whose boss is getting paid big dollars for high quality, virtually bug-free software), cannot get rid of those pesky lines of inability…

For starters, why tackle those warnings in the first place? You’re just the new guy; you don’t want to get into any trouble. Also, by fixing warnings you may introduce new bugs (even though you’re super competent; after all, creating bugs is just a matter of probability). Nobody wants that! And finally, you’re a professional. The client is only paying for the new functionality, not for you to clean up the place. And as for your colleagues, you don’t want to hurt anyone’s feelings. So in short: the man says no. No way fixing those warnings, José!

Situations like these inspired us to envision a new kind of solution to those issues at hand. Let me introduce to you, without further ado, the Visual Studio extension Only My Warnings.

When your solution is under source control a new list button is added to the Error List pane, with the options All, Only Mine and Local Only. With the option Local Only selected, the pane will only show errors, warnings and messages caused by local changes, i.e. by you! All other messages are hidden, so you don’t have to deal with them. Victory!

screen shot

The extension doesn’t get in the way; it only adds this one list button. When the Only Mine option is selected, the tool will gather all messages from your checkins, and show only the messages caused by you. And it follows branches, so you can focus on the feature that is important at any given moment.

Now you can be productive, and still have a chat with your colleagues without having an argument! While using this extension, you will contribute to your team’s productivity. You will be a team productivity enabler! That’s how 10× developers are bred! In a way, Only My Warnings contributes to world peace.

For now, the extension is still in its early alpha phase. If you want to participate in the beta program, or want to get notified when the extension is released to the Visual Studio Gallery, please fill out this form.

(this is a blog post in the series: I’m perfect, communicate better with tools)

A console.log() adventure; pimped

Permalink - Posted on 2016-03-06 12:45

Over a year ago, my interwebfriend secretGeek created a thing. I created a fork, and pimped it into this (and copied some of his blog lines too). I urge you to go and try it out before you read on. Unfortunately you’ll need to be on a desktop computer, not a mobile phone.

Go there now.

In case you don’t have a desktop computer anymore (woah, futuristic!), or have already tried it out, I’ll give some spoilers and discussion now.


Actually, I’ll give some space before the spoilers. Scroll now for spoilers.

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

↓ spoilers ahead ↓

Here is secretGeek’s blog-entry. When playing the game I had loads of fun. Keeping a map on graph paper, just to make sure the map is deterministic and doesn’t use some random numbers. After some more playing, I started looking at the code and wanted to tinker with it. So I forked it. I had some wild and interesting ideas.

But somehow, nothing happened.

I guess I was overthinking it. A year later, I remembered that one time at work, while being bored, I created the so-called mini-webventure. I didn’t think too much back then, so why do it now?

So I just started. After running around in the forest (in the game, that is), I got tired of typing parentheses. And with property getters available in all browsers nowadays, there’s no need for them. So instead of typing n(), you can now enter n to go north.
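The trick is a property getter: evaluating the property runs code, so typing a bare `n` in the console acts like a function call. A sketch, with a made-up `go()` helper standing in for the game's real movement code:

```javascript
// Sketch: movement commands as getters. The go() helper and the
// commands object are invented for illustration; in the real game the
// getters would live on the global object so the player just types `n`.
function go(direction) {
  return 'You head ' + direction + '.';
}
const commands = {
  get n() { return go('north'); },
  get s() { return go('south'); },
};
commands.n; // merely reading the property runs go('north')
```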

Another thing: in a JavaScript game, inspecting and modifying variables from the console is an easy way to cheat. So I prevented that with an IIFE and some help from strict mode (to eliminate global variables).
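The pattern looks something like this; the `game` object and its methods are hypothetical, but the IIFE-plus-strict-mode structure is the point.

```javascript
// Sketch: hide game state inside an IIFE so the console can't reach it.
// 'use strict' stops accidental assignments from creating globals.
const game = (function () {
  'use strict';
  let health = 3;            // private: not reachable from outside
  return {
    hit() { return --health; },
    isAlive() { return health > 0; },
  };
})();
// `health` itself is invisible outside the IIFE; only the returned
// methods can touch it, so players can't just type `health = 999`.
```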

Multiple lines are now returned after entering a command, so they are displayed in the same font and color. And deaths are a bit more dramatic now. You should really try it, if you haven’t already…

After dying too often, implementing load/save was next. First I just saved the game state, but then I realized this was another way to cheat, so now only the entered commands are stored. Then I realized, a demo-mode is very easy to implement. Couldn’t resist it, but it’s hidden somewhere in the game.
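The command-replay idea can be sketched like this; `makeGame` and `execute` are invented names, but the save format (commands only, no state) follows the post, and it shows why demo mode falls out for free.

```javascript
// Sketch: save only the entered commands; loading replays them from a
// fresh start, so saved games can't encode an impossible state.
// execute() is a stand-in for the game's real command handler.
function makeGame(execute) {
  const history = [];
  return {
    enter(cmd) { history.push(cmd); return execute(cmd); },
    save() { return JSON.stringify(history); },
    load(saved) {
      history.length = 0;
      for (const cmd of JSON.parse(saved)) {
        history.push(cmd);
        execute(cmd);        // replay; slow the loop down and it's demo mode
      }
    },
  };
}
```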

I didn’t want to change the map or story. Just wanted to share the fun I had, both playing and pimping the game.

First post!

Permalink - Posted on 2016-01-09 19:30

OK, I started yet another blog.

Here are the predecessors (and thank my public backup service for most of this):

I also have a Tumblr micro-blogging thing called dd3v.