
Asynchronous Operation

The import, export and render commands support asynchronous operation.

In synchronous mode, these commands wait until the job is complete before returning a status. Because these jobs can take a long time to complete, synchronous requests are vulnerable to HTTP timeouts and other connection problems.

Therefore, we provide an asynchronous mode for advanced users who want maximum robustness.

The asynchronous API uses the standard REST asynchronous mechanism.

How asynchronous Clara.io API calls work

When an asynchronous operation is requested, the server will immediately return a 202 response without waiting for the operation to complete.

The 202 response will contain a Location header pointing to a job URL. You may poll this job URL to determine the job status, either by parsing the returned JSON or by looking at the HTTP response code.

If you wish to parse the job JSON, the status attribute will contain ok or failed when the job is complete.
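
For illustration, the job JSON for a completed job might look roughly like the following. Only the status and deleteURL attributes are described in this document; the URL value is just a placeholder, and real jobs may include additional fields.

{
  "status": "ok",
  "deleteURL": "https://clara.io/..."
}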

Parsing is not necessary, though. The job URL will return an HTTP status of 200 while the job is still in progress, and 303 when the job is complete.

The 303 response will redirect via a Location header to the output file (this behavior can be disabled with the query parameter redirect=false). It is not necessary (but not harmful) to follow this redirect for jobs that do not produce output, such as an import.
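
Putting this together, a typical asynchronous exchange looks roughly like the following sketch (URLs are placeholders):

POST /api/scenes/<sceneId>/import?async=1
  -> 202 Accepted, Location: <job URL>

GET <job URL>        (while the job is still running)
  -> 200 OK, body: job JSON

GET <job URL>        (once the job is complete)
  -> 303 See Other, Location: <output file URL>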

Cancelling a Job

One of the attributes in the job JSON will be deleteURL. If you send a DELETE request to this URL, the render job will be cancelled.
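
As a minimal sketch using the same superagent setup as the Node.js examples below (this assumes the job JSON has already been fetched into a job variable and that deleteURL is an absolute URL):

// Cancel a running job by sending a DELETE request to its deleteURL.
superagent.del(job.deleteURL)
  .auth(username, apiToken)
  .end(function(err, response) {
    if (err) return console.error('Cancel failed: ' + err);
    console.log('Job cancelled, status ' + response.statusCode);
  });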

JavaScript/Node.js Examples

The following examples are in JavaScript with Node.js. To use them, the superagent package needs to be installed:

npm install superagent

var superagent = require('superagent');
var fs = require('fs');
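
// The examples below assume these variables are defined; the values here
// are placeholders (they mirror the credentials used in the curl examples).
var username = 'username';
var apiToken = 'c3be3060-fe81-467b-aa62-0ee42eea9c8b';
var sceneId = 'de8891ef-6ee5-4a2c-8884-507e3648ea93';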

// Poll the job URL until the job finishes, then call back with the final
// response. .redirects(0) stops superagent from following the 303 redirect
// so that we can inspect the status code ourselves.
function jobStatusCheck(location, callback) {
  superagent.get(location)
    .auth(username, apiToken)
    .set('Accept', 'application/json')
    .redirects(0)
    .end(function(err, response) {
      if (err || response.statusCode >= 400 || response.body.status === 'failed') {
        return callback(err ||
          new Error('Import error: '+response.statusCode+'\n'+JSON.stringify(response.body)));
      }

      // 303 means the job is complete; hand the response back to the caller.
      if (response.statusCode === 303) return callback(null, response);

      // Still in progress (200); check again in five seconds.
      setTimeout(function() {
        jobStatusCheck(location, callback);
      }, 5000);
    });
}

Import Example

// Start an asynchronous import by adding async=1 to the command URL.
module.exports = function(complete) {
  superagent.post('https://clara.io/api/scenes/'+sceneId+'/import?async=1')
    .attach('obj', 'examples/models/obj/zebra/zebra.obj')
    .attach('mtl', 'examples/models/obj/zebra/zebra.mtl')
    .attach('img', 'examples/models/obj/zebra/zebra-atlas.jpg')
    .auth(username, apiToken)
    .set('Accept', 'application/json')
    .end(function(err, response) {
      if (err || response.statusCode >= 400 || response.body.status === 'failed') {
        return complete(err ||
          new Error('Import error: '+response.statusCode+'\n'+JSON.stringify(response.body)));
      }

      // Poll the job URL from the Location header; redirect=false disables
      // the redirect to the output file, which an import does not produce.
      jobStatusCheck(response.headers.location+'?redirect=false', complete);
    });
};

Render Example

// Start a render, wait for the job to complete, then download the result.
module.exports = function(complete) {
  superagent.post('https://clara.io/api/scenes/'+sceneId+'/render')
    .auth(username, apiToken)
    .end(function(err, response) {
      if (err || response.statusCode >= 400 || response.body.status === 'failed') {
        return complete(err ||
          new Error('Render error: '+response.statusCode+'\n'+JSON.stringify(response.body)));
      }

      jobStatusCheck(response.headers.location, function(err, response2) {
        if (err) return complete(err);

        // The 303 response's Location header points at the rendered image;
        // stream it to disk.
        var stream = fs.createWriteStream('result.jpg');
        stream.on('close', complete);

        superagent.get(response2.headers.location)
          .auth(username, apiToken)
          .pipe(stream);
      });
    });
};

Import Example using Curl and Bash

# Start an asynchronous import and capture the job URL from the
# Location header of the 202 response.
LOCATION=$(curl -X POST -D- \
    -u username:c3be3060-fe81-467b-aa62-0ee42eea9c8b \
    -F "obj=@butcherboy/butcherboy.obj" \
    -F "mtl=@butcherboy/butcherboy.mtl" \
    -F "img=@butcherboy/ButcherBoyDiffMap.jpg" \
    -F async=1 \
    "https://clara.io/api/scenes/de8891ef-6ee5-4a2c-8884-507e3648ea93/import" |
     sed -En 's/^Location: (.*)\r/\1/p')
# Poll the job URL every 60 seconds until it answers 303 (job complete).
while [[ $(curl -s -w "%{http_code}" -o /dev/null \
           -u username:c3be3060-fe81-467b-aa62-0ee42eea9c8b \
           "${LOCATION}") != 303 ]] ; do
    sleep 60
done

Render Example using Curl

# Start the render and capture the job URL from the Location header.
LOCATION=$(curl -X POST -D- -u username:c3be3060-fe81-467b-aa62-0ee42eea9c8b \
    https://clara.io/api/scenes/53083190-2933-4d22-b38b-e74e677e89a1/render |
     sed -En 's/^Location: (.*)\r/\1/p')
# Poll until the job answers 303, then follow the redirect to download the image.
while [[ $(curl -s -w "%{http_code}" -o /dev/null \
           -u username:c3be3060-fe81-467b-aa62-0ee42eea9c8b \
           "${LOCATION}") != 303 ]] ; do
    sleep 60
done
curl -L -u username:c3be3060-fe81-467b-aa62-0ee42eea9c8b "${LOCATION}" -o render.jpg

Export Example using Python

Python’s urllib automatically follows redirects and doesn’t provide an option to disable this behaviour. Therefore, instead of using the HTTP response code to determine whether the job is done, we inspect the data we receive to see whether it looks like JSON.

This works fine, but if you want to do things “correctly” with the HTTP status code, you can use either the low-level http.client module or requests.

We highly recommend using requests; it would result in much simpler code than the following.

import urllib.request
import urllib.parse
import base64
import time
import json

scene = "de8891ef-6ee5-4a2c-8884-507e3648ea93"
user = "username"
token = "c3be3060-fe81-467b-aa62-0ee42eea9c8b"
clara = "https://clara.io"

# Request an asynchronous zipped OBJ export; the 202 response's Location
# header contains the job URL.
params = urllib.parse.urlencode({"async": "true", "zip": "true"}).encode("utf-8")
request = urllib.request.Request("%s/api/scenes/%s/export/obj" % (clara, scene), params)
auth = base64.b64encode(("%s:%s" % (user, token)).encode('ascii')).decode('ascii')
request.add_header("Authorization", "Basic %s" % auth)
response = urllib.request.urlopen(request)
job_url = response.info().get("Location")

job_request = urllib.request.Request(job_url)
job_request.add_header("Authorization", "Basic %s" % auth)

def get_zip(job_request):
    job_response = urllib.request.urlopen(job_request)
    data = job_response.read()
    if data[0] != ord(b'{'):
        # Not JSON: the 303 redirect was followed automatically,
        # so this is the exported zip itself.
        return data
    else:
        # Still JSON: the job is either in progress or finished without output.
        job_json = json.loads(data.decode('utf8'))
        if job_json['status'] != 'ok' and job_json['status'] != 'failed':
            # Job still running; try again in ten seconds.
            time.sleep(10)
            return get_zip(job_request)
        else:
            return None

print(get_zip(job_request))
