JSON Validation and Type Driven Development

In my personal projects I have fallen in love with solving my problems via Type Driven Development.

Given a language with static types, generics, and first-class functions, you hit the sweet spot for this kind of development. The only hard requirement is first-class functions, because this is an application of Lambda calculus principles.

The Problem with any

Typed languages provide safety. If the developer uses an API incorrectly, the computer will yell at them.

type Product = {
  readonly name: string
}

function createProduct(name: string): Product {
  return { name };
}

createProduct(5);

When calling createProduct with name of something other than a string the computer cries out:

Argument of type '5' is not assignable to parameter of type 'string'.

A problem I want to solve in one of my side-projects is JSON safety. Take Product as an example. When serializing it with JSON.stringify and then parsing it with JSON.parse, the type is lost:

type User = {
    readonly username: string
}

function renameUser(name: string, user: User): void {
   // implementation left blank
}

const product = createProduct('some product');

renameUser('some user', product);
renameUser('some user', JSON.parse(JSON.stringify(product)));

The second call to renameUser shows no error. The first call to renameUser shows:

Argument of type 'Product' is not assignable to parameter of type 'User'.
  Property 'username' is missing in type 'Product' but required in type 'User'.

If we wrote a unit test, I’m confident we could prove that product and JSON.parse(JSON.stringify(product)) are deeply equal.

The problem is that JSON.parse() returns any (in TypeScript and Flow).
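To see the hole, consider what the compiler accepts once a value is typed any. A minimal sketch (the property names are illustrative):

```typescript
// JSON.parse is typed in TypeScript's standard library as returning `any`,
// so the compiler accepts any property access on the result.
const parsed = JSON.parse('{"username": "sam"}');

const username: string = parsed.username;     // exists at runtime
const missing: number = parsed.doesNotExist;  // undefined at runtime, still no compile error
```

Both assignments type check, even though the second one lies about what the data contains.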

A similar problem exists in all of the languages I have come across:

  • Java’s org.json.JSONObject and org.json.JSONArray
  • Swift/Objective-C’s JSONSerialization/NSJSONSerialization
  • PHP’s json_decode

Going from binary data to native object is inherently unsafe. When the JSON data comes in from an external system – like a REST API – the risk is real.

A Band-Aid

In a language like TypeScript or Flow, the straightforward way to safely deal with JSON values is through type refinement.

This results in an increasing number of type guards as different members within the any type are accessed. Assuming your chosen REST API layer does JSON marshaling for you:

const result = await api.get('http://example.dev/api/people');

if (result && result.people && Array.isArray(result.people)) {
   result.people.map(person => {
      // more runtime type refining 💩
   })
}

If both client and server are under your control, or you feel confident enough in the REST API maintainers, you might feel brazen enough to force the situation:

type PeopleResponse = { people: Array<Person> };

const result: PeopleResponse = await api.get('http://example.dev/api/people');

// go along your merry way until your Runtime errors start popping up

This is madness. It assumes type safety when there isn’t any. Unfortunately, this is what I see most often in projects at work.

The prospect of writing lines and lines of type refinements for every possible JSON structure in every API response is a lot of work. In my “toy” project I already have 21 different REST API calls with varying shapes of responses, and that’s only going to grow.

Can I write a JSON validation layer that’s as declarative as creating custom TypeScript types?

Let’s give it a shot.

Defining our Validation Types

Time to start practicing Type Driven Development.

What is Type Driven Development? Start with types, then write implementations to satisfy the type checker. It’s like Test Driven Development, but you don’t even have to write the tests.

Our current problem is pretty clear. We need a way to write functions that validate some JSON any type. That means we need a function that accepts a single any type as its input.

But which type does it return? That should be up to the implementation of the validation, and at this point, that implementation doesn’t exist. So we’ll use a generic type to stand in its place:

type Validator<T> = (value: any) => T;

This states that a Validator<T> is a Function that accepts a single any and returns a T.

This makes sense for success cases, but what about failure cases? What happens when validation fails?

At this point there are two options to deal with failure:

  • throw an Error
  • return a Union type to indicate success or failure modes.

Common usage of a Validator<T> expects failure. Using throw might feel simpler at the implementation level, but it forces the user of the Validator<T> to take on that complexity. TypeScript’s (or Flow’s) Union types allow for safe handling of success/failure modes.

Here’s what a Union type API looks like:

type Success<T> = {
  readonly type: 'success'
  readonly value: T
}

type Failure = {
  readonly type: 'failure'
  readonly value: any
  readonly reason: string
}

type Result<T> = Success<T> | Failure;

type Validator<T> = (value: any) => Result<T>;

This looks like the complete set of types for a “validation” API. A function that accepts any thing and returns Success<T> or Failure. The Success<T> boxes the typed value with the refined type. The Failure contains the original value and the reason that validation failed.

Let’s write our first validator:

const isString: Validator<string> = (value) => {
    if (typeof value === 'string') {
        return { type: 'success', value }
    } else {
        return {
            type: 'failure',
            value,
            reason: 'typeof value is ' + (typeof value)
        };
    }
}

With tsc and jest we can confirm that both type checking and runtime behavior match our expectations:

describe('isString', () => {
    it('succeeds', () => {
        const validator: Validator<string> = isString;
        const value: Result<string> = validator('yes');
        expect(value).toEqual(success('yes'));
    })
});

The remaining non-container types (everything but Array and Object) are equally trivial. And to make things a little more convenient we can give Success<T> and Failure factory functions:

function success<T>(value: T): Success<T> {
    return {
        type: 'success',
        value,
    };
}

function failure(value: any, reason: string): Failure {
    return {
        type: 'failure',
        value,
        reason,
    };
}

Now isString, isNumber, isNull, isUndefined, isObject, isArray and isBoolean can all follow this pattern:

const isNull: Validator<null> = value =>
    value === null
        ? success(null)
        : failure(value, 'typeof value is ' + (typeof value));

With each basic case we can write the corresponding set of tests to confirm the runtime characteristics and the static type checker’s ability to infer types.

But JSON is more complex than these base types, and our TypeScript types can be more complicated still, with nullables and unions.

We need to be able to combine these base cases into something that can address our real world needs.

Combining Simple Types to Make Complicated Ones

Optional types in TypeScript and Flow are a Union type of null or some type T.

type Optional<T> = null | T;

If we wanted to validate an optional type our validator’s type would be Validator<null|T>.

An optional string validator would have the type Validator<null|string>. We have a Validator<string> already, so perhaps we can utilize that.

const isOptionalString: Validator<null|string> = value => {
    if (value === null) {
         return success(null);
    }
    return isString(value);
}

This works fine, but the idea of writing each isOptionalX sounds boring. And TypeScript types can be more complex than null|T. They can be string | number or any other set of unions.

Since we’re playing at leveraging Lambda calculus concepts, we can lift ourselves out of the minutiae of Validator<T> implementations and start working with the validators themselves.

Given two different validators Validator<A> and Validator<B>, can we use what we know about validators to create a Validator<A|B>?

Using Type Driven Development, let’s stub out the function signature:

function oneOf<A,B>(a: Validator<A>, b: Validator<B>): Validator<A|B> {

}

At this point tsc is upset:

A function whose declared type is neither 'void' nor 'any' must return a value.

What should we return? A Validator<A|B> is like any other validator in that it accepts a single any argument. In Type Driven Development style, let’s return a function since that’s what it wants:

function oneOf<A,B>(a: Validator<A>, b: Validator<B>): Validator<A|B> {
    return value => {
    }
}

Now tsc says:

Type '(value: any) => void' is not assignable to type 'Validator<A | B>'.
  Type 'void' is not assignable to type 'Result<A | B>'.

Our function isn’t correct yet. It has no return value (void) but a Validator<A | B> needs to return a Result<A | B>.

We now have all of the inputs we need to do that within the scope of this function. All we need to do is use them:

function oneOf<A,B>(a: Validator<A>, b: Validator<B>): Validator<A|B> {
    return value => {
        return a(value);
    }
}

Now tsc is happy, but does it have the runtime characteristics we want?

describe('oneOf', () => {
   it('succeeds', () => {
      const validator = oneOf(isNumber, isString);
      expect(validator('a')).toEqual(success('a'));
      expect(validator(1)).toEqual(success(1));
   });
});

What does jest think:

    expect(received).toEqual(expected) // deep equality

    - Expected  - 1
    + Received  + 2

      Object {
    -   "type": "success",
    +   "reason": "typeof value is number",
    +   "type": "failure",
        "value": 1,
      }

It failed with the number value, as it should have, because we didn’t use both Validator<T>s.

function oneOf<A,B>(a: Validator<A>, b: Validator<B>): Validator<A|B> {
    return value => {
        const result_a = a(value);
        if (result_a.type === 'success') {
            return result_a;
        }
        return b(value);
    }
}

If Validator<A> succeeds, we return a Success<A>. Otherwise return the result of Validator<B> which is Success<B> | Failure.

We’ve written a function that accepts two Validator<T> types and returns a new Validator<A|B> by combining them. We wrote a combinator.

I have so far failed to create a variadic version of oneOf that can take “n” Validator<T>s and infer the union Validator<T1|T2|Tn> type. This means we need to use multiple calls to oneOf to build up inferred union types:

const validator: Validator<null|string|number> = oneOf(
    isNull,
    oneOf(isNumber, isString)
);

Since nullable types are so common – and because it’s so easy to do given our APIs – we can use oneOf to make a convenient combinator that takes a Validator<T> and turns it into a Validator<null | T>. I’ll name it optional.

Definition:

export const optional = <T>(validator: Validator<T>): Validator<null|T> =>
    oneOf(isNull, validator);

And in use:

import { optional, isNumber } from './validator';

const validate = optional(isNumber);

validate(1); // returns Success<null | number>;
validate(null); // returns Success<null | number>;
validate('hi'); // returns Failure

Again, we’re using a combinator to build up a complex Validator<T> without actually implementing any new Validator<T>s.

We can do the same thing to build Object and Array validators.

TypeScript’s Mapped types

The ideal API for validating should be as terse and declarative as a custom TypeScript type. Here’s a somewhat complex type:

type Record = {
  readonly name: string
  readonly owner: {
    readonly id: number
    readonly name: string
    readonly role: 'admin' | 'member' | 'visitor'
  }
}

This is my ideal API:

const validateRecord = objectOf({
    name: isString,
    owner: objectOf({
        id: isNumber,
        name: isString,
        role: isValidRole,
    }),
});

The combinator we want to make here is objectOf. It takes a plain object whose keys point to Validator<T> values and returns a Validator<T> for the matching object shape.

In TypeScript we can infer this type using Mapped types. One of the examples looks similar to what we want:

Now that you know how to wrap the properties of a type, the next thing you’ll want to do is unwrap them. Fortunately, that’s pretty easy:

type Proxify<T> = {
  [P in keyof T]: Proxy<T[P]>;
};

function unproxify<T>(t: Proxify<T>): T {
  let result = {} as T;
  for (const k in t) {
    result[k] = t[k].get();
  }
  return result;
}

In terms of our domain we want to map the keys K of some generic object T into validators that validate the type at key K in T.

export function objectOf<T extends {}>(
  validators: {[K in keyof T]: Validator<T[K]>}
): Validator<T> {

}

So far what does tsc think:

A function whose declared type is neither 'void' nor 'any' must return a value.

Time to implement the combinator:

  1. Declare an instance of the validated T (result)
  2. Iterate through the keys of the mapped validators.
  3. Validate the value at value[key] with its corresponding validators[key].
    1. If Success<T[K]>, set result[key] = validated.value
    2. If Failure, return the Failure
  4. Return success(result)

export function objectOf<T extends {}>(
  validators: {[K in keyof T]: Validator<T[K]>}
): Validator<T> {
    return value => {
        const result = {} as T;
        for (const key in validators) {
            const validated = validators[key](value ? value[key] : undefined);
            if (validated.type === 'failure') {
                return validated;
            }
            result[key] = validated.value;
        }
        return success(result);
    };
}

Now for a test:

describe('objectOf', () => {
  it('validates', () => {
    const validate = objectOf({
        name: isString,
        child: objectOf({
           id: isNumber
        }),
    });

    const valid = {
        name: 'Valid',
        child: { id: 1 },
    };

    const invalid = {
        name: 'Invalid',
        child: { id: 'not-number' },
    };
    
    expect(validate(valid)).toEqual(success(valid));
    expect(validate(invalid)).toEqual(failure(invalid, 'typeof value is string' ));
  });
});

And both tsc and jest are happy. Not only does it validate as expected, but it also infers the shape of the value:

Screen capture of Visual Studio Code showing the inferred shape of validate.

It knows that this particular use of objectOf creates a:

Validator<{name: string, child: {id: number}}>

Which returns a Result<T> type of:

Result<{name: string, child: {id: number}}>

An example in action:

const validate = objectOf({
    id: isNumber,
    name: oneOf(isString, isNull),
    role: oneOf(isNull, objectOf({
        type: isString,
        groupId: isNumber
    }))
});

let result = validate(JSON.parse('{"name": "sam", "id": 5}'));
if (result.type === 'success') {
    /**
     * result is Success<{
     *   id: number,
     *   name: string | null,
     *   role: null | {type: string, groupId: number }
     * }>
     */
    result.value.name // null | string
    result.value.role // null | {type: string, groupId: number}
} else {
    // Failure
    throw new Error(result.reason);
}

If you already have a type you know you need to validate for, you can use it as the generic argument to objectOf and tsc will enforce that all of the keys are present:

type Record = { id: number, name: string };

const validate = objectOf<Record>({});

The tsc error shows:

Argument of type '{}' is not assignable to parameter of type '{ id: Validator<number>; name: Validator<string>; }'.
  Type '{}' is missing the following properties from type '{ id: Validator<number>; name: Validator<string>; }': id, name

It knows a validator for the Record type needs an id validator and a name validator.

It even knows which type of Validator<T> it needs:

const validate = objectOf<Record>({
    id: isString,
    name: isString,
});

id in Record has a type of number, but isString cannot validate to number:

(property) id: Validator<number>
Type '(value: any) => Result<string>' is not assignable to type 'Validator<number>'.
  Type 'Result<string>' is not assignable to type 'Result<number>'.
    Type 'Readonly<{ type: "success"; value: string; }>' is not assignable to type 'Result<number>'.
      Type 'Readonly<{ type: "success"; value: string; }>' is not assignable to type 'Readonly<{ type: "success"; value: number; }>'.
        Types of property 'value' are incompatible.
          Type 'string' is not assignable to type 'number'

You can see how it worked out that the id validator of isString does not return a Result<T> that is compatible with number which is the type of Record['id'].

One last thing to make objectOf a little nicer to use. When it iterates through the keys of the validators and reaches a Failure, it returns the Failure as is. This resulted in a somewhat opaque failure reason:

    const invalid = {
        name: 'Invalid',
        child: { id: 'not-number' },
    };
    
    expect(validate(invalid)).toEqual(failure(invalid, 'typeof value is string' ));

The "typeof value is string" message failed because invalid.child.id was a string, not a number. Given we know which key was being validated when the Failure was returned, we can improve the error message:

function keyedFailure(value: any, key: string | number, failure: Failure): Failure {
    return {
        ...failure,
        value,
        reason: `Failed at '${key}': ${failure.reason}`,
    };
}

Now the failure in objectOf can be passed through keyedFailure before returning:

for (const key in validators) {
    const validated = validators[key](value ? value[key] : undefined);
    if (validated.type === 'failure') {
        return keyedFailure(value, key, validated);
    }
    result[key] = validated.value;
}

The improved error message is now:

"Failed at 'child': Failed at 'id': typeof value is string"

The value at .child.id was a string, and that’s why there’s a failure. Much clearer.

We’re an arrayOf implementation away from a fully capable JSON validation library. But before we go there, we’re going to detour into more combinators.

Combinators

In Lambda calculus a combinator is an abstraction (function) whose identifiers are all bound within that abstraction. In short, no “global” variables.

If we consider the behavior of Validator<T> and how it returns one of two values Success<T> or Failure a natural branching control flow reveals itself.

In our example uses of Validator<T> instances, the first step after calling one is refining the Result<T> by checking result.type for either success or failure.

Given how common this pattern is, we can write some combinators to make them slightly easier to work with.

In most uses of Validator<T> we want to do something with the boxed value of the Success<T> case of Result<T>.

This looks like:

const result: Result<Thing> = validate(thing);
if (result.type === 'success') {
  const value: Thing = result.value;
  // do something interesting with value
}

The pattern here is refining to the success case, then using the success value in a new domain. So if the user of validate had a function of type:

(thing: Thing) => OtherThing

It would be nice if they could forego the extra refinement work. We can define that pattern in a combinator.

We want to map the success case into a new domain.

function mapSuccess<A, B>(result: Result<A>, map: (value: A) => B): B|Failure {
  if (result.type === 'success') {
    return map(result.value);
  }
  return result;
}

And in use:

function isAdmin(user: User): boolean {
  // something interesting
  return true;
}

const validate = objectOf<User>({ ... });

const isAdminResult: boolean | Failure = mapSuccess(validate(JSON.parse("{...}")), isAdmin);

And for the sake of completeness, the comparable mapFailure:

function mapFailure<A,B>(result: Result<A>, map: (value: Failure) => B): Success<A>|B {
  if (result.type === 'failure') {
    return map(result);
  }
  return result;
}

Why would you want this? It allows you to write pure functions in your business domain, like isAdmin above, and then combine them with the Validator<T> domain, without using any glue code.

The fewer lines of code, the fewer variables to type. And we have tsc there to let us know when the function signatures don’t match.

For instance, trying to use a function that takes something other than a User is going to fail type analysis when used with mapSuccess(Result<User>, ...).

The less often you need to cross domains within your APIs, the more decoupled they are.

Validating Array

A Validator<T> returns a Result<T>. What if we wanted to continue validating T and turn it into another type? Let’s consider Array.

The first step in turning an any type into an Array<T> is checking that it is in fact an Array.

This is similar to our other base validators:

const isArray: Validator<any[]> = value =>
    Array.isArray(value) ? success(value) : failure(value, 'value is not an array');

The next step is iterating through each member in the Array<any> and validating the member. Since we’re practicing Type Driven Development, we’ll start with the type signature.

function arrayOf<T>(validator: Validator<T>): Validator<Array<T>> {
}

And just like before, tsc isn’t happy:

A function whose declared type is neither 'void' nor 'any' must return a value.

We just defined isArray. It would be neat if we could use it here. Thinking about it, it would be nice to be able to take the success case of isArray and then do more validation to it and return a mapped Result<Array<T>>.

Let’s write one more combinator that maps a Validator<A> into a Validator<B> given a function of (value: A) => Result<B>.

function mapValidator<A, B>(
  validator: Validator<A>,
  map: (value: A) => Result<B>
): Validator<B> {

}

If the Result<A> case is a Failure, it should be returned right away, but if it’s a Success<A> we want to unbox it and give it to (value: A) => Result<B>.

Does that sound familiar? We want to map the success result of Validator<A>. That’s mapSuccess. We can define mapValidator in terms of mapSuccess:

function mapValidator<A, B>(
  validator: Validator<A>,
  map: (value: A) => Result<B>
): Validator<B> {
    return value => mapSuccess(validator(value), map);
}

Using mapValidator allows us to define a validation in terms of another Validator<T>.
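As a quick aside, mapValidator composes with more than just isArray. Here is a self-contained sketch that refines isNumber into a hypothetical isInteger (the type and factory definitions are repeated from earlier so the snippet stands alone; isInteger is my example, not part of the post’s library):

```typescript
// Core types and helpers, repeated from earlier in the post.
type Success<T> = { readonly type: 'success'; readonly value: T };
type Failure = { readonly type: 'failure'; readonly value: any; readonly reason: string };
type Result<T> = Success<T> | Failure;
type Validator<T> = (value: any) => Result<T>;

const success = <T>(value: T): Success<T> => ({ type: 'success', value });
const failure = (value: any, reason: string): Failure => ({ type: 'failure', value, reason });

const isNumber: Validator<number> = value =>
    typeof value === 'number' ? success(value) : failure(value, 'typeof value is ' + typeof value);

function mapSuccess<A, B>(result: Result<A>, map: (value: A) => B): B | Failure {
    return result.type === 'success' ? map(result.value) : result;
}

function mapValidator<A, B>(validator: Validator<A>, map: (value: A) => Result<B>): Validator<B> {
    return value => mapSuccess(validator(value), map);
}

// Hypothetical: refine "is a number" into "is an integer"
// without re-checking typeof inside the new validator.
const isInteger: Validator<number> = mapValidator(isNumber, n =>
    Number.isInteger(n) ? success(n) : failure(n, 'number is not an integer'));
```

The integer check never sees a non-number; mapValidator only hands it the success case of isNumber.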

So now we can define Validator<Array<T>> in terms of Validator<any[]>:

function arrayOf<T>(validate: Validator<T>): Validator<Array<T>> {
    return mapValidator(isArray, (value) => {
    
    });
}

At this point tsc can determine that value is type any[]. But to satisfy Validator<Array<T>> we need to validate each member of any[] with Validator<T>.

If any item fails validation, the whole Array fails validation. So not only are we validating each member, but potentially returning a Failure case. We need to reduce any[] to Result<Array<T>>.

We can seed the reduce call with an empty success case:

return mapValidator(isArray, (value) =>
    value.reduce<Result<T[]>>(
        (result, member) => undefined, // placeholder for now
        success([])
    )
);

But what to use for our reduce function? We’re declaring to Array.prototype.reduce that the first argument and return value is a Result<T[]>. That means the type of our reduce function needs to be of type:

(result: Result<T[]>, member: any, index: number) => Result<T[]>

If result is ever the Failure case, we don’t want to do anything, we only want to handle the Success<T[]> case. That’s another case for mapSuccess:

(result, member, index) => mapSuccess(
    result,
    (items) => 
)

Now that we are within an iteration of the array, we have enough context to use our Validator<T> on the member. If it’s successful, we want to concat it with the rest of items, if a failure, we’ll just return it (for now).

Another case for mapSuccess:

(result, member, index) => mapSuccess(
    result,
    (items) => mapSuccess(
        validate(member),
        valid => success(items.concat([valid]))
    )
)

And here’s the complete arrayOf:

function arrayOf<T>(validate: Validator<T>): Validator<Array<T>> {
    return mapValidator(isArray, (value) =>
        value.reduce<Result<T[]>>(
            (result, member, index) => mapSuccess(
                result,
                items => mapSuccess(
                    validate(member),
                    valid => success(items.concat([valid]))
                )
            ),
            success([])
        )
    );
}

In a test:

describe('arrayOf', () => {
   const validate = arrayOf(objectOf({ name: isString }));
   it('succeeds', () => {
      const values = [{name: 'Rumpleteazer'}];
      expect(validate(values)).toEqual(success(values));
   });

   it('fails', () => {
      const values = [{name: 1}];
      expect(validate(values)).toEqual(failure(values, 'Failed at \'name\': typeof value is number'));
   });
});

One last thing before we tie a ribbon on Validator<T>. The Failure case reason says:

"Failed at 'name': typeof value is number"

In the context of .reduce we know which index we are currently on while iterating. So when we validate the member, we can use mapFailure to enhance the Failure case. Here’s the new reducer:

(result, member, index) => mapSuccess(
    result,
    items => mapSuccess(
        mapFailure(
            validate(member),
            failure => keyedFailure(items, index, failure)
        ),
        valid => success(items.concat([valid]))
    )
),

And now the Failure reason is:

"Failed at '0': Failed at 'name': typeof value is number"
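To tie the pieces together, here is a condensed, self-contained sketch of the objectOf half of the library as built above (the names match the post; the validatePerson example at the end is mine, not from the original project):

```typescript
// Condensed sketch of the validator library assembled in this post.
type Success<T> = { readonly type: 'success'; readonly value: T };
type Failure = { readonly type: 'failure'; readonly value: any; readonly reason: string };
type Result<T> = Success<T> | Failure;
type Validator<T> = (value: any) => Result<T>;

const success = <T>(value: T): Success<T> => ({ type: 'success', value });
const failure = (value: any, reason: string): Failure => ({ type: 'failure', value, reason });

const isString: Validator<string> = value =>
    typeof value === 'string' ? success(value) : failure(value, 'typeof value is ' + typeof value);
const isNumber: Validator<number> = value =>
    typeof value === 'number' ? success(value) : failure(value, 'typeof value is ' + typeof value);
const isNull: Validator<null> = value =>
    value === null ? success(null) : failure(value, 'typeof value is ' + typeof value);

function oneOf<A, B>(a: Validator<A>, b: Validator<B>): Validator<A | B> {
    return value => {
        const result = a(value);
        return result.type === 'success' ? result : b(value);
    };
}

function keyedFailure(value: any, key: string | number, f: Failure): Failure {
    return { ...f, value, reason: `Failed at '${key}': ${f.reason}` };
}

function objectOf<T extends {}>(validators: { [K in keyof T]: Validator<T[K]> }): Validator<T> {
    return value => {
        const result = {} as T;
        for (const key in validators) {
            const validated = validators[key](value ? value[key] : undefined);
            if (validated.type === 'failure') {
                return keyedFailure(value, key, validated);
            }
            result[key] = validated.value;
        }
        return success(result);
    };
}

// Hypothetical payload: name is required, age may be null.
const validatePerson = objectOf({
    name: isString,
    age: oneOf(isNull, isNumber),
});

const ok = validatePerson(JSON.parse('{"name": "sam", "age": 33}'));
const bad = validatePerson(JSON.parse('{"name": 5, "age": 33}'));
// ok.type === 'success'
// bad.type === 'failure' with reason "Failed at 'name': typeof value is number"
```

Everything from JSON.parse onward stays typed: a Success<{name: string, age: number | null}> or a Failure that says exactly which key went wrong.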

Wrapping It Up

I have now used this library to create type safety for all of my project’s JSON based REST APIs.

Functions that once used half of their lines for type refinements are now one mapSuccess away from type safe response values.

Making my API responses type safe was a matter of mapping my JSON decoders to Validator<T> instances.

Before:

export const v3SubmitOrders = jsonEncodedRequest(
    fw(build.post('/v3/submit_orders')),
    ({options}: SubmitOrders) => ({orders: options.orders, validate_only: options.validate_only !== false}),
    response.decodeJson
);

After:

export const v3SubmitOrders = jsonEncodedRequest(
    fw(build.post('/v3/submit_orders')),
    ({options}: SubmitOrders) => ({orders: options.orders, validate_only: options.validate_only !== false}),
    response.mapHandler(response.decodeJson, objectOf({
        status: validateStatus,
        orders: arrayOf(objectOf({
            order_po: isString,
            order_id: isNumber,
            order_confirmation_id: isNumber,
            order_confirmation_datetime: isString,
        })),
        debug: isAnyValue,
        misc: isAnyValue,
    }))
);

One Promise resolver later, and I have type safe JSON responses:

const result = await v3SubmitOrders({orders: [123]}).then(requireValidResponse);

Implementing a Validator<T> not only provides type safety, it also provides better documentation.

Without fail, every time I approach an API using Lambda calculus principles I end with an API that is declarative and easy to combine.


Strongly Typed WP-API

The first in a series of posts exploring WP-API with statically typed PHP and Functional Programming patterns.

The Context

To expose a resource as an endpoint via WordPress’ WP-API interface one must use register_rest_route.

/**
 * Registers a REST API route.
 *
 * Note: Do not use before the {@see 'rest_api_init'} hook.
 *
 * @since 4.4.0
 * @since 5.1.0 Added a _doing_it_wrong() notice when not called on or after the rest_api_init hook.
 *
 * @param string $namespace The first URL segment after core prefix. Should be unique to your package/plugin.
 * @param string $route     The base URL for route you are adding.
 * @param array  $args      Optional. Either an array of options for the endpoint, or an array of arrays for
 *                          multiple methods. Default empty array.
 * @param bool   $override  Optional. If the route already exists, should we override it? True overrides,
 *                          false merges (with newer overriding if duplicate keys exist). Default false.
 * @return bool True on success, false on error.
 */
function register_rest_route( $namespace, $route, $args = array(), $override = false ) {

The documentation here is incredibly opaque so it’s probably a good idea to have the handbook page open until the API is internalized in your brain.

The $namespace and $route arguments are somewhat clear, however in typical WordPress PHP fashion the bulk of the magic is provided through an opaquely documented @param array $args.

The bare minimum is the methods and callback keys, and for our purposes that will be all we need. WP_REST_Server provides some handy constants (READABLE, CREATABLE, DELETABLE, EDITABLE) for the methods key, so that leaves callback.

What is callback? In PHP terms it’s a callable. Many things in PHP can be a callable. The most commonly used callable for WordPress tends to be a string value that is the name of a function:

function my_callable() {
}
register_rest_route( 'some-namespace', '/some/path', [ 'callback' => 'my_callable' ] );

This would call my_callable, and as is would probably return a 200 response with an empty body.

What would be more useful than just a callable is a callable that can declare its argument types and return types.

Types and PHP

The ability to verify the correctness of software is an obvious benefit of strongly typed languages.

However, an additional benefit is how the types themselves become the natural documentation to the code.

PHP has supported type hinting for a while:

function totes_not_buggy( WP_REST_Request $request ): WP_REST_Response {
}

With type hints the expectations for totes_not_buggy() are much clearer.

Adding these type hints means that at runtime PHP will enforce that only instances of WP_REST_Request can be passed to totes_not_buggy(), and that totes_not_buggy() can only return instances of WP_REST_Response.

This sounds good except that this is enforced at runtime. For true type safety we want something better, we want static type analysis. Types should be enforced without running the code.

For this exercise, Psalm will provide static type analysis via PHPDoc annotations.

/**
 * Responds to a REST request with text/plain "You did it!"
 *
 * @param WP_REST_Request $request
 * @return WP_REST_Response
 */
function totes_not_buggy($request) {
   return new WP_REST_Response( 'You did it!', 200, [ 'content-type' => 'text/plain' ] );
}

Ok this all sounds nice in theory, how do we check this with Psalm?

To the terminal!

mkdir -p ~/code/wp-api-fun
cd ~/code/wp-api-fun
composer init

Accept all the defaults and say “no” to the dependencies:

Package name (<vendor>/<name>) [beaucollins/wp-api-fun]: 
Description []: 
Author [Beau Collins <beau@collins.pub>, n to skip]: 
Minimum Stability []: 
Package Type (e.g. library, project, metapackage, composer-plugin) []: 
License []: 
Define your dependencies.
Would you like to define your dependencies (require) interactively [yes]? no
Would you like to define your dev dependencies (require-dev) interactively [yes]? no
{
    "name": "beaucollins/wp-api-fun",
    "authors": [
        {
            "name": "Beau Collins",
            "email": "beau@collins.pub"
        }
    ],
    "require": {}
}
Do you confirm generation [yes]? 

Now install two dependencies:

  • vimeo/psalm to run type checking
  • php-stubs/wordpress-stubs to type check against WordPress APIs

composer require --dev vimeo/psalm php-stubs/wordpress-stubs

Assuming success, try to run Psalm:

./vendor/bin/psalm
Could not locate a config XML file in path /Users/beau/code/wp-api-fun/. Have you run 'psalm --init' ?

To keep things simple with composer, define a single PHP file to be loaded for our project at the path ./src/fun.php:

mkdir src
touch src/fun.php

Now inform composer.json where this file is via the "autoload" key:

{
    "name": "beaucollins/wp-api-fun",
    "authors": [
        {
            "name": "Beau Collins",
            "email": "beau@collins.pub"
        }
    ],
    "require": {},
    "require-dev": {
        "vimeo/psalm": "^3.9",
        "php-stubs/wordpress-stubs": "^5.3"
    },
    "autoload": {
        "files": ["src/fun.php"]
    }
}

Generate Psalm’s config file and run it to verify our empty PHP file has zero errors:

./vendor/bin/psalm --init
Calculating best config level based on project files
Scanning files...
Analyzing files...
░
Detected level 1 as a suitable initial default
Config file created successfully. Please re-run psalm.
./vendor/bin/psalm
Scanning files...
Analyzing files...
░
------------------------------
No errors found!
------------------------------
Checks took 0.12 seconds and used 37.515MB of memory
Psalm was unable to infer types in the codebase

For a quick gut-check define totes_not_buggy() in ./src/fun.php:

<?php
// in ./src/fun.php
/**
 * Responds to a REST request with text/plain "You did it!"
 *
 * @param WP_REST_Request $request
 * @return WP_REST_Response
 */
function totes_not_buggy($request) {
   return new WP_REST_Response( 'You did it!', 200, ['content-type' => 'text/plain'] );
}

Now analyze with Psalm:

./vendor/bin/psalm
Scanning files...
Analyzing files...
E
ERROR: UndefinedDocblockClass - src/fun.php:6:11 - Docblock-defined class or interface WP_REST_Request does not exist
 * @param WP_REST_Request $request
ERROR: UndefinedDocblockClass - src/fun.php:7:12 - Docblock-defined class or interface WP_REST_Response does not exist
 * @return WP_REST_Response
ERROR: MixedInferredReturnType - src/fun.php:7:12 - Could not verify return type 'WP_REST_Response' for totes_not_buggy
 * @return WP_REST_Response
------------------------------
3 errors found
------------------------------
Checks took 0.15 seconds and used 40.758MB of memory
Psalm was unable to infer types in the codebase

Psalm doesn’t know about WordPress APIs yet. Time to teach it where those are by adding the stubs to ./psalm.xml:

    <stubs>
        <file name="vendor/php-stubs/wordpress-stubs/wordpress-stubs.php" />
    </stubs>
</psalm>
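
For context, the whole file might look roughly like this. This is a sketch based on typical psalm --init output for Psalm 3; the attributes in your generated file may differ, with only the <stubs> element added by hand:

```xml
<?xml version="1.0"?>
<psalm
    errorLevel="1"
    resolveFromConfigFile="true"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns="https://getpsalm.org/schema/config"
>
    <projectFiles>
        <directory name="src" />
        <ignoreFiles>
            <directory name="vendor" />
        </ignoreFiles>
    </projectFiles>
    <!-- Added manually: teach Psalm the WordPress API signatures -->
    <stubs>
        <file name="vendor/php-stubs/wordpress-stubs/wordpress-stubs.php" />
    </stubs>
</psalm>
```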

One more run of Psalm:

./vendor/bin/psalm     
Scanning files...
Analyzing files...
░
------------------------------
No errors found!
------------------------------
Checks took 5.10 seconds and used 356.681MB of memory
Psalm was able to infer types for 100% of the codebase

No errors! It knows about WP_REST_Request and WP_REST_Response now.

What happens if they’re used incorrectly, say a string for the status code in the WP_REST_Response constructor?

ERROR: InvalidScalarArgument - src/fun.php:10:48 - Argument 2 of WP_REST_Response::__construct expects int, string(200) provided
   return new WP_REST_Response( 'You did it!', '200', ['content-type' => 'text/plain'] );

Nice! Before running the PHP source, Psalm can tell us whether it is correct. IDEs with Psalm integrations show the errors in place:

Screen capture of Visual Studio Code with fun.php open and the Psalm error displayed in a tool tip.
Visual Studio Code with the Psalm extension enabled showing the InvalidScalarArgument error.

Now to answer the question “which type of callable is the register_rest_route() callback option?”

First-Class Functions

With PHP’s type hinting, the best type it can offer for the callback parameter is callable.

This gives no insight into which arguments the callable requires nor what it returns.
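
As a sketch of the limitation (register_handler here is a hypothetical stand-in, not a WordPress function), any callable at all satisfies the native hint:

```php
<?php
/**
 * PHP's native `callable` hint accepts any callable whatsoever:
 * the engine cannot check what it takes or what it gives back.
 */
function register_handler( string $path, callable $handler ): void {
    // Nothing here constrains $handler's signature or return type.
}

// Both pass the native type check, despite wildly different shapes:
register_handler( '/a', 'strlen' );
register_handler( '/b', function ( int $x, int $y ): array { return [ $x, $y ]; } );
```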

With Psalm integrated into the project there are more tools available to better describe this callable type.

callable(Type1, OptionalType2=, SpreadType3...):ReturnType

Using this syntax, the callback option of $args can be described as:

callable(WP_REST_Request):(WP_REST_Response|WP_Error|JSONSerializable)

This line defines a callable that accepts a WP_REST_Request and can return one of WP_REST_Response, WP_Error or JSONSerializable.

Once returned, WP_REST_Server will do what is required to correctly deliver an HTTP response. Anything that conforms to this can be a callback for WP-API. The WP-API world is now more clearly defined:

callable(WP_REST_Request):(WP_REST_Response|WP_Error|JSONSerializable)

To illustrate this type at work define a function that accepts a callable that will be used with register_rest_route().

Following WordPress conventions, each function name will be prefixed with totes_ as an ad-hoc namespace of sorts (yes, this is completely ignoring PHP namespaces).

/**
 * @param string $path
 * @param (callable(WP_REST_Request):(WP_REST_Response|WP_Error|JSONSerializable)) $handler
 * @return void
 */
function totes_register_api_endpoint( $path, $handler ) {
   register_rest_route( 'totes', $path, [
      'callback' => $handler
   ] );
}
add_action( 'rest_api_init', function() {
   totes_register_api_endpoint('not-buggy', 'totes_not_buggy');
} );

A quick check with Psalm shows no errors:

------------------------------
No errors found!
------------------------------

What happens if the developer has a typo in the string name of the callback totes_not_buggy? Perhaps they accidentally typed totes_not_bugy?

ERROR: UndefinedFunction - src/fun.php:24:45 - Function totes_not_bugy does not exist
   totes_register_api_endpoint('not-buggy', 'totes_not_bugy');

Fantastic!

What happens if the totes_not_buggy function does not conform to the callable(WP_REST_Request):(...) type? Perhaps it returns an int instead:

/**
 * Responds to a REST request with text/plain "You did it!"
 *
 * @param WP_REST_Request $request
 * @return int
 */
function totes_not_buggy( $request ) {
   return new WP_REST_Response("not buggy", 200, ['content-type' => 'text/plain']);
}
ERROR: InvalidArgument - src/fun.php:24:45 - Argument 2 of totes_register_api_endpoint expects callable(WP_REST_Request):(JSONSerializable|WP_Error|WP_REST_Response), string(totes_not_buggy) provided
   totes_register_api_endpoint('not-buggy', 'totes_not_buggy');

The callable string 'totes_not_buggy' no longer conforms to the API. Psalm is catching these bugs before anything is even executed.

But Does it Work?

Psalm says this code is correct, but does this code work? Well, there’s only one way to find out.

First, turn ./src/fun.php into a WordPress plugin with the minimal amount of header comments:

<?php
/**
 * Plugin Name: Totes
 */

And boot WordPress via wp-env:

npm install -g @wordpress/env
echo '{"plugins": ["./src/fun.php"]}' > .wp-env.json
wp-env start

There are the endpoints:

curl --silent http://localhost:8889/\?rest_route\=/ | \
  jq '.routes|keys' | \
  grep totes
  "/totes",
  "/totes/not-buggy",
curl http://localhost:8889/\?rest_route\=/totes/not-buggy
"not buggy"

Well it works, but there’s a small problem. It looks like WordPress decided to json_encode() the string literal not buggy so it arrived in quotes as "not buggy" (not very not buggy).

Changing the return of totes_not_buggy to something more JSON compatible works as expected:

-    return new WP_REST_Response("not buggy", 200, ['content-type' => 'text/plain']);
+    return new WP_REST_Response( [ 'status' => 'not-buggy' ] );
curl http://localhost:8889/\?rest_route\=/totes/not-buggy          
{"status":"not-buggy"}

Automate It

Reproducing the steps to run Psalm on this codebase is trivial.

With a concise GitHub Actions definition this project can get static analysis on every push. Throw in an annotation service and pull request changes are marked with Psalm warnings and errors.

Screenshot of an annotated Pull Request on GitHub.

The GitHub workflow definition describes how to:

  1. Install Composer.
  2. Install Composer dependencies (with caching).
  3. Run composer check.
  4. Report the Psalm errors.
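
A minimal sketch of such a workflow; the file name and the composer check script are assumptions (check would be defined in composer.json's "scripts" as an alias for running Psalm), not taken from the actual repository:

```yaml
# .github/workflows/psalm.yml (hypothetical name)
name: Static Analysis
on: push
jobs:
  psalm:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Install dependencies
        run: composer install --no-progress
      # Assumes composer.json defines: "scripts": { "check": "psalm" }
      - name: Run Psalm
        run: composer check
```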

The Fun Part

This sets up the foundation for a highly productive development environment:

  • Psalm static analysis provides instant feedback on correctness of code.
  • wp-env allows for fast verification of running code.
  • GitHub Actions automates type checking as an ongoing concern.

Coming up: exploring functional programming patterns for WP-API with the help of Psalm.

Categories
Uncategorized

The number of people walking around Disneyland® with $200.00 plastic light swords is too damn high!

Correction: I have been told the handles are made of metal materials. Sentiment still stands.

Categories
Uncategorized

Hello, You Have Recently Written a Long Form Article Using Gutenberg

Please use the number that best indicates your pain level.

9

It could always be worse.

Categories
Uncategorized

Slack Geekbot Lifehack

Mark yourself as away and Geekbot never bothers you.

Categories
Uncategorized

Dryer Outage Postmortem

Seattle, Washington. Tuesday, December 10th, 2019 the Collins household discovered the control panel of the LG DLE2516W did not respond to button pushes.

DuckDuckGo was consulted. LG support provides troubleshooting steps.

  • Unplug unit.
  • Press & hold power button for 5 seconds.
  • Press & hold start button for 5 seconds.
  • Plug in unit.

Instructions were followed with no change in behavior. Unit still unresponsive.

Household administrator researches potential fixes. Likely culprits:

  • Circuit tripped at circuit breaker.
  • Faulty door switch.
  • Blown high temperature fuse in heater.

Multimeter is not on premises. Multimeter, door switch, and high temperature fuse purchased from Amazon.com.

Thursday, December 12th, 2019. Door switch arrives. Replaced on unit. No change. Collins leave on trip to Portland, Oregon.

Tuesday, December 17th, 2019. Multimeter and fuse arrive. High voltage outlet and unit power supply measure 240v.

To access high temperature fuse entire unit must be disassembled.

Unit disassembled. Multimeter finds fuse to be in working order. Replaced anyway. Ducts inspected and reveal no blockage.

Unit reassembled. Control board lights up when power button pressed. When start is pressed a relay clicks but motor does not engage (cue Picard).

Unit disassembled. Previous reassembly found to be performed by incompetent technician. Assembler received cursing. Curser and cursee roles were fulfilled by the same person.

Unit reassembled. Powers on. Motor starts.

There was much rejoicing.

Steps to prevent future outages: clean the lint from the filter.

Categories
Uncategorized

Dead Calm on the Strait

Categories
Uncategorized

Sunrise at Lopez Island

Getting up early to prepare to cross the Strait of Juan de Fuca.

Categories
Uncategorized

Guérilla Marketing

Time to try Pho Tran on The Ave.

Categories
Uncategorized

Poulsbo Viking Fest

Tracked route from Shilshole to Poulsbo and back.

After some back and forth about the weather we decided to shove off and sail to Poulsbo for the Viking Fest.

Ballard does its own festival but we thought we’d up the ante by sailing from one Norwegian festival to another.

We left the dock just before noon and arrived in Poulsbo at 14:05.

To guarantee we’d arrive for the festivities we motored the whole way there. In the channels Nirvana goes just over 6 knots with the 9.9hp motor opened up. We had the current at our stern as we entered Agate Pass and reached 10.2 knots over ground.

We all watched the parade as it marched through the main street of Poulsbo. From what I can tell it’s a town proud of its Viking heritage, outdoors kids clubs, high school marching bands, and beauty pageant winners.

On the return trip I hoisted the sails at Point Bolin and got a couple tacks in before motoring back through Agate pass. We were leaving Madison Bay as the sun was setting so we dropped the sails and motored back to help keep the passengers from being out in too cold of an evening.

The winds really picked up as we crossed the Puget Sound and would have made for a fun crossing under sail.

Well over 4 hours on the water to log towards the captain’s license!