Announcing External Pipeline beta

16 Feb 2021

We thought it could be interesting to see if people could make use of the core ability of Restspace to compose HTTP services if it was extracted and exposed on the web as a service itself. We've now got a basic version up and running without any accounts or management around it, so it's still very much a beta. It runs on Cloudflare as a worker and the url is https://external-pipeline.r-s.workers.dev. It's free, but it runs on a free Cloudflare account, so you might hit the built-in request limit if this suddenly gets very popular! (This also means it's quite a bit slower than it will be when we get it onto a paid-for account.) We're planning to keep this service free up to around 1 million reqs/month.

So what does this actually do? Well, you attach a JSON pipeline specification to your request (either as the POST body or as a query string argument called $pipeline) and the service runs the pipeline and returns the result. What's a pipeline specification? It's described in the Restspace docs, but the idea is that it's basically a sequence of urls that you request in a chain, piping the output of the first into the request to the second, and so on. It has a lot of programming-language-like structure around it: you can put conditions on steps, run sections in parallel, split and combine messages (e.g. using zip) and also transform data between steps.

Here's an example request, which reads the content JSON for this blog and extracts only the code blocks:

POST https://external-pipeline.r-s.workers.dev
[
  "targetHost https://restspace.io",
  "GET /json/blog/Announcing%20External%20Pipeline%20beta",
  {
    "codeBlocks":
    [
      "expressionMap",
      [
        "expressionFilter", "blocks", "blockType === 'code'"
      ],
      "content"
    ]
  }
]
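
If you want to call this from your own code, here's a minimal sketch in TypeScript, assuming the service accepts the pipeline as a raw JSON POST body as described above (the same spec could presumably be URL-encoded into the $pipeline query string argument instead):

// Send the pipeline above as the POST body and read back the result.
// We're assuming a JSON content type is appropriate here.
const pipeline = [
  "targetHost https://restspace.io",
  "GET /json/blog/Announcing%20External%20Pipeline%20beta",
  {
    codeBlocks: [
      "expressionMap",
      ["expressionFilter", "blocks", "blockType === 'code'"],
      "content"
    ]
  }
];

const resp = await fetch("https://external-pipeline.r-s.workers.dev", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(pipeline)
});
console.log(await resp.json());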

The first use case we imagine for this is performing the same function as GraphQL: combining data from multiple sources and shaping it, but without the need to build a server to do it, without scaling problems, and incredibly cheaply.

But beyond that, Restspace pipelines aim to promote a small but powerful change in how we view building things on the web. They encourage viewing HTTP services as data processors, not just sources of data (like a web page) or sinks of data (like when you write to an API). This enables, for instance, taking a JSON file, piping it into a template, then piping the result into an email service.
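
As a sketch of what such a pipeline might look like — the template and email service URLs below are entirely hypothetical, and we're assuming here that steps can use other verbs and absolute urls in the same way the GET step does in the example above:

// Hypothetical sketch only: fetch a JSON file, render it through a
// template service, then pipe the rendered output into an email service.
// None of these URLs are real; substitute your own services.
const emailPipeline = [
  "GET https://example.com/data/order-123.json",
  "POST https://example.com/templates/order-confirmation",
  "POST https://example.com/email/send"
];
// Send emailPipeline to https://external-pipeline.r-s.workers.dev as shown earlier.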

Restspace pipelines work with any kind of payload, not just JSON. So for instance, a user chooses some image files to download, then you send a pipeline to this service which requests those files in parallel, packages the results into a zip, and returns it for your user to download.
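
On the client side that could look something like the sketch below; the pipeline spec itself is left as a placeholder because the parallel-request and zip steps aren't shown in this post (see the docs for their syntax):

// Sketch of the browser side: post a pipeline that fetches some images
// and zips them, then hand the binary response to the user as a download.
// buildZipPipeline is a placeholder -- see the pipeline docs for the
// parallel and zip steps; we don't reproduce their syntax here.
declare function buildZipPipeline(imageUrls: string[]): unknown;

async function downloadImagesAsZip(imageUrls: string[]) {
  const resp = await fetch("https://external-pipeline.r-s.workers.dev", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildZipPipeline(imageUrls))
  });
  const zipBlob = await resp.blob();   // the zip comes back as the response body
  const link = document.createElement("a");
  link.href = URL.createObjectURL(zipBlob);
  link.download = "images.zip";
  link.click();
}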

You can see examples and try out some requests at the pipeline playground.

We still have a lot of work to do on this service. It currently doesn't let you unzip a zip file or combine and split multipart messages, which the main Restspace pipeline will let you do, so we want to fix that soon. We also want to put an account system around it which will allow you to use secrets within your pipelines. At the moment, the only way to deal with authenticated services is via Authorization header forwarding: the pipeline forwards the Authorization header on the original request to any request made in the pipeline to the default target host (see the docs for the targetHost directive). This is limited and requires you to hold an API key or JWT in memory, which isn't super secure.
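
For example, if the pipeline's default target host needs a bearer token, as we understand the current behaviour you send it on the outer request and it gets forwarded to that host — a sketch, with a placeholder token and a hypothetical API:

// The Authorization header on this request is forwarded to requests the
// pipeline makes to the default target host (see the targetHost docs).
// The token and API here are placeholders.
const authed = await fetch("https://external-pipeline.r-s.workers.dev", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": "Bearer <your-api-token>"
  },
  body: JSON.stringify([
    "targetHost https://api.example.com",
    "GET /v1/protected-resource"
  ])
});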

Let us know anything else you'd like to see, and send any feedback to feedback@restspace.io. Don't hesitate to write; we really need your input.