Tech Blog


Swagger SDK

posted Apr 3, 2016, 8:39 AM by Val Huber

Swagger empowers a community of SDK code generators. For example, a Java SDK generated from a Swagger definition looks like this:

// imports of the generated client classes, java.util.List, and
// java.text.SimpleDateFormat are omitted here
CustomerBusinessObjectApi cboApi = new CustomerBusinessObjectApi();
ApiClient client = cboApi.getApiClient();
client.setBasePath("http://localhost:8080/APIServer/rest/default/demo/v1");
client.addDefaultHeader("Authorization", "CALiveAPICreator demo_full:1");
client.setDateFormat(new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSS"));

// all sixteen optional filter/paging parameters are left null
List<CustomerBusinessObject> cbos = cboApi.customerBusinessObjectGet(null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null);
for (CustomerBusinessObject cbo : cbos) {
    System.out.println("Customer: " + cbo.getName() + " has balance " + cbo.getBalance());
    List<CustomerBusinessObjectOrders> orders = cbo.getOrders();
    for (CustomerBusinessObjectOrders order : orders) {
        System.out.println("    Order: " + order.getOrderNumber() + " has total " + order.getAmountTotal());
        List<CustomerBusinessObjectOrdersLineItems> items = order.getLineItems();
        for (CustomerBusinessObjectOrdersLineItems item : items) {
            System.out.println("        Item " + item.getLineitemId() + " has amount " + item.getAmount());
        }
    }
}



Docker for Databases

posted Apr 2, 2016, 7:39 PM by Val Huber

You can run databases such as PostgreSQL in Docker containers. For example:

Create the container

In a Docker shell:

docker run --name some-postgres -p 5432:5432 -e POSTGRES_PASSWORD=mysecretpassword -d postgres 

Later, you can stop and remove it like this:

docker rm -f some-postgres

Important! This container has no volume mounted, so the database loses all data, including tables, when you remove the container. (To keep data across runs, mount a volume, e.g. -v pgdata:/var/lib/postgresql/data.) It is therefore useful for demos and experiments, but not for production.

Connect with Live API Creator

Connect using the host, port, and credentials listed for each image below.

Docker Database Images

MySQL

Versions available: 5.5, 5.6, 5.7

docker run --name MySQL56 -p 3306:3306 -e MYSQL_ROOT_PASSWORD=Password1 -d mysql:5.6

Connecting

Host name: 192.168.99.100 (usually)

Port: 3306

User: root

Password: Password1

DB2

DB2 is different: you have to run a few commands manually inside the container, and leave the terminal open.

Version 10.5:

docker run --name DB2_10 -i -t -p 50000:50000 -e DB2INST1_PASSWORD=expresso123 -e LICENSE=accept ibmcom/db2express-c:latest bash

Then:

su - db2inst1
db2start
db2sampl

Connecting

Host name: 192.168.99.100 (usually)

Port: 50000

User: db2inst1

Password: expresso123

Database: SAMPLE

Oracle

Version 11g:

docker run -d -p 1521:1521 wnameless/oracle-xe-11g

Connecting

Host name: 192.168.99.100 (usually)
Port: 1521
SID: xe
User: system
Password: oracle

PostgreSQL

Versions available: 9.1, 9.2, 9.3, 9.4, 9.5

docker run -d --name Postgres95 -p 5432:5432 -e POSTGRES_PASSWORD=mysecretpassword postgres:9.5

Connecting
Host name: 192.168.99.100 (usually)
Port: 5432
Database: postgres
User: postgres
Password: mysecretpassword
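
For JDBC-based clients such as Live API Creator, the settings above translate into connection URLs roughly like these (a sketch; exact syntax depends on the driver version you install):

```
jdbc:mysql://192.168.99.100:3306/                # MySQL
jdbc:db2://192.168.99.100:50000/SAMPLE           # DB2
jdbc:oracle:thin:@192.168.99.100:1521:xe         # Oracle (SID-style)
jdbc:postgresql://192.168.99.100:5432/postgres   # PostgreSQL
```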

Let's speed up Protractor's data lookup

posted Feb 16, 2016, 1:41 PM by Michael Holleran

Protractor is the favored testing framework for end-to-end testing of AngularJS apps. It's quite good, and I especially like the smooth integration with Sauce Labs.

But Protractor can sometimes be very slow. For instance, if we have a page that shows a data table with (say) 3 columns (name, balance and creditLimit) and 20 rows, and I want to get all the values displayed in that table, the "normal" way to retrieve the data in that table would be something like:

// Parameters:
// tableSelector: the css selector for the table rows, e.g. "#leftGridContainer .ngRow"
// columnSelector: the additional css selector for the columns in a row, e.g. ".ng-binding"
// colNames: an array of strings with the names of the columns

var getTableValues = function(tableSelector, columnSelector, colNames) {
  return element.all(by.css(tableSelector)).map(function(row, index) {
    var columns = row.all(by.css(columnSelector));
    return columns.then(function(cols){
      var result = {};
      cols.forEach(function(col, idx) {
        result[colNames[idx]] = col.getText();
        result.rowElm = row;
      });
      return result;
    });
  });
};

This works. The problem is that it takes about 7 seconds -- and that's every time we want to get these values. Because these values may change, we do need to re-fetch them every time we want a fresh look at them. That makes our test script run like molasses on a cold day -- there is a 7 second delay every time we need to check something in that table.

There is a workaround, which is to bypass WebDriver and go straight to the browser. This is more convoluted, but it's well worth it. The idea is to generate a piece of code that will then be run (carefully -- we don't want to disturb it) in the browser. We'll then parse its output.

// Note: the final step below uses lodash, assumed to be available as _
var getTableValues = function(tableSelector, columnSelector, colNames) {
  return browser.driver.executeScript("return (function(){" +
    "var rows = []; var row = {}; var colNames = " + JSON.stringify(colNames) + ";" +
    "angular.element('" + tableSelector + " " + columnSelector + "')" +
    ".each(function(idx, c) {" +
    "  var colIdx = idx % colNames.length;" +
    "  row[colNames[colIdx]] = $(c).text();" +
    "  if (colIdx == colNames.length - 1) {" +
    "    rows.push(row);" +
    "    row = {};" +
    "  }" +
    "});" +
    "return JSON.stringify(rows);" +
    "})();").
  then(function(s) {
  var data = JSON.parse(s);
  // We have the table data, now supplement it with the WebElement for each row
  return element.all(by.css(tableSelector)).then(function(rows) {
    _.each(rows, function(row, idx) {
      data[idx].rowElm = row;
    });
    return data;
  });
});
};

This can be called like this:

var tableDataPromise = getTableValues('#leftTableDiv .ngRow', '.ngCellText .ng-binding', ["name", "balance", "creditLimit"]);

On my machine, this runs in under 100ms. That is a lot quicker than 7 seconds.
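
The heart of the injected snippet is the modulo-based grouping: a flat list of cell texts is folded into one object per row. Pulled out into plain JavaScript (groupCells is a name I'm using here purely for illustration), it looks like this:

```javascript
// Fold a flat array of cell texts into row objects, one object per
// colNames.length consecutive cells -- the same idx % colNames.length
// trick used in the injected script above.
function groupCells(cellTexts, colNames) {
  var rows = [];
  var row = {};
  cellTexts.forEach(function (text, idx) {
    var colIdx = idx % colNames.length;
    row[colNames[colIdx]] = text;
    if (colIdx === colNames.length - 1) {
      rows.push(row);
      row = {};
    }
  });
  return rows;
}

var rows = groupCells(
  ["Alice", "100", "5000", "Bob", "200", "7000"],
  ["name", "balance", "creditLimit"]
);
console.log(JSON.stringify(rows));
// → [{"name":"Alice","balance":"100","creditLimit":"5000"},
//    {"name":"Bob","balance":"200","creditLimit":"7000"}]
```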

For a relatively simple script, this brought our total execution time from 2.5 minutes to under a minute -- a big improvement!

I wish WebDriver wasn't this sluggish when navigating the DOM, but with this approach, at least, there is a way to address the biggest bottlenecks.

Handling Links in JSON

posted Feb 16, 2016, 1:41 PM by Michael Holleran   [ updated Aug 15, 2016, 1:48 PM by Laura Carrubba ]

The Live API Creator team has been working on our handling of links, and many questions have arisen. There is a lot of confusion over how to represent links in JSON.

One possibility:

   "customer" : "http://rest.logicbeam.com/v1/customer/123"

The problem with that is that it's not immediately obvious to a program that this is a link (it could be a piece of text that just happens to be a URI), and even worse, this tells us nothing about what's on the other side of this link, or how we're related to it.

Another possibility is to have composite links, e.g.:

{
  "href" : "http://rest.logicbeam.com/v1/employee/456",
  "rel" : "owner",
  "title" : "Owner",
  "type" : "http://rest.logicbean.com/v1/employee"
}

This is much more descriptive, although there is still a lot of uncertainty regarding the "proper" values for rel and type.

There is in fact an IANA registry of "legal" values for the rel attribute, but a quick look at it should convince you that it's not all that useful in a REST/JSON context.

It's become quite common for JSON objects to include a links section, for example:

{
  "name" : "Billy Bob's bait shop",
  "address" : "123 Main st, Anytown, USA"
  "links" : [
    {
      "href" : "http://rest.logicbeam.com/v1/employee/456",
      "rel" : "owner",
      "title" : "Owner",
      "type" : "http://rest.logicbean.com/v1/employee"
    }
  ]
}

This makes it very clear, but it's also rather verbose.
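
One payoff of the verbose form is that clients can navigate links generically by rel, without guessing whether a string happens to be a URI. A minimal sketch (findLink is a hypothetical helper, not part of any Live API Creator API):

```javascript
// Return the href of the first link with the given rel, or null.
// Assumes the object carries a "links" array in the shape shown above.
function findLink(obj, rel) {
  var links = obj.links || [];
  for (var i = 0; i < links.length; i++) {
    if (links[i].rel === rel) {
      return links[i].href;
    }
  }
  return null;
}

var shop = {
  "name": "Billy Bob's bait shop",
  "address": "123 Main st, Anytown, USA",
  "links": [
    {
      "href": "http://rest.logicbeam.com/v1/employee/456",
      "rel": "owner",
      "title": "Owner",
      "type": "http://rest.logicbeam.com/v1/employee"
    }
  ]
};

console.log(findLink(shop, "owner"));
// → http://rest.logicbeam.com/v1/employee/456
```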

For more background, see Mark Nottingham's post on linking in JSON: http://www.mnot.net/blog/2011/11/25/linking_in_json
