So, I’m spit-balling a problem. I’m building a property editor like the one I remember having access to in Visual Studio. I found bdimitrijoski’s clean-web-ui-property-grid, and it’s a good start. I built something similar last year, and then I kicked myself because I didn’t make it work well with large objects. The killer feature, and the prime difficulty, is dealing with an array of objects (and the implicit recursion of complexity).

So, let’s start from the ground floor and define some requirements. First, since the platform is reactive and aims to make all software collaborative, the property editor must be differentiable. That is, changes to the object emit deltas rather than “hey, the object changed, and here is the final result”. By virtue of emitting deltas, it must ingest deltas as well.

For a simple object with just finite fields (strings, numbers, booleans), this is easy, as RFC 7386 (JSON Merge Patch) provides a foundation for JSON-delta integration. Problematically, arrays and their associated indices are a nightmare. In the backend, I’ve solved the “array problem” with a delta stream that leverages object ids and an ordering array of ids which depends on a starting point.
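As a reference point for the easy case, here is a minimal sketch of the RFC 7386 merge semantics (`mergePatch` is my own name, not part of the grid): null deletes a field, nested objects merge recursively, and everything else, including arrays, replaces wholesale — which is exactly why arrays need more machinery.

```javascript
// Minimal RFC 7386-style merge: null deletes a field, nested
// objects merge recursively, anything else replaces outright.
function mergePatch(target, patch) {
  if (typeof patch !== 'object' || patch === null || Array.isArray(patch)) {
    // non-objects (and, per the RFC, arrays) replace wholesale
    return patch;
  }
  var result = (typeof target === 'object' && target !== null &&
                !Array.isArray(target)) ? target : {};
  for (var key in patch) {
    if (patch[key] === null) {
      delete result[key]; // null is the deletion marker
    } else {
      result[key] = mergePatch(result[key], patch[key]);
    }
  }
  return result;
}

var doc = { x: 1, y: 2, uv: { x: 0.5 } };
var out = mergePatch(doc, { y: null, uv: { y: 0.7 } });
// out is { x: 1, uv: { x: 0.5, y: 0.7 } }
```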

In the property editor, I don’t have quite the same environment, since two people can insert objects and the results should merge reasonably well. So, let’s call out the operations that we need to support:

- add a new element in the array at various positions (head, tail, in the middle)
- remove any element from the array
- mutate any element within the array
- arrange the order of elements

Now, we can address these problems by taking inspiration from some prior work.
Namely, if we invent ids for each element in the array and then have some way to order those ids, then that’s *a* way.
Unlike my prior solution, which was server-to-client synchronization, sending a complete ordering is going to have problems with conflicts.
Thus, we need to take inspiration from CRDTs.

Before we outline the CRDT, we need to establish that the entire object, in all its messiness, can be made into a flat table. For example, the object:

```
{
  "x": 123,
  "y": 42,
  "uv": {
    "x": 0.5,
    "y": 0.7
  }
}
```

can be made into a flat table via:

key | value |
---|---|
x | 123 |
y | 42 |
uv.x | 0.5 |
uv.y | 0.7 |

The reason we do this is so the object can be stored inside a table, which lets us leverage existing infrastructure. Here, we also see why arrays are so problematic.
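The non-array flattening is straightforward. Here is a hypothetical helper (`flatten` is my name) that produces the dot-separated keys from the table above; note that arrays are deliberately left as opaque leaves, since they are the very problem this post is working through.

```javascript
// Flatten a nested object into dot-separated keys.
function flatten(obj, prefix, out) {
  prefix = prefix || '';
  out = out || {};
  for (var key in obj) {
    var val = obj[key];
    if (typeof val === 'object' && val !== null && !Array.isArray(val)) {
      // recurse into sub-objects, extending the path
      flatten(val, prefix + key + '.', out);
    } else {
      // leaf (including arrays, which flat keys can't express yet)
      out[prefix + key] = val;
    }
  }
  return out;
}

var flat = flatten({ x: 123, y: 42, uv: { x: 0.5, y: 0.7 } });
// flat is { 'x': 123, 'y': 42, 'uv.x': 0.5, 'uv.y': 0.7 }
```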

So, let’s start with a simple array of scalars. We could imagine a flat encoding of

```
{
  v: [1, 2, 3]
}
```

to be:

key | value |
---|---|
v!0 | 1 |
v!1 | 2 |
v!2 | 3 |

However, now appending to the head or removing the head requires a delta that rewrites the entire array. An implicit assumption and requirement at play here is to minimize churn. Instead, we can invent “arbitrary ids”:

key | value |
---|---|
v!i241 | 1 |
v!i952 | 2 |
v!i198 | 3 |

The problem now is how to arrange the objects. However, we can observe that individual entries can be added/removed/updated with minimal cost. Ordering this chaos is the next step.
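To make the churn argument concrete, here is a small illustrative diff over the two encodings (`diffFlat` and the specific ids are my own): a head insert under index keys touches every entry, while under invented ids it touches one.

```javascript
// Compute the flat delta between two states; null marks a deletion.
function diffFlat(before, after) {
  var delta = {};
  for (var k in after) {
    if (before[k] !== after[k]) { delta[k] = after[k]; }
  }
  for (var k in before) {
    if (!(k in after)) { delta[k] = null; }
  }
  return delta;
}

// inserting 0 at the head under index keys rewrites every entry
var indexed = diffFlat(
  { 'v!0': 1, 'v!1': 2, 'v!2': 3 },
  { 'v!0': 0, 'v!1': 1, 'v!2': 2, 'v!3': 3 });

// the same insert under invented ids adds exactly one entry
var byId = diffFlat(
  { 'v!i241': 1, 'v!i952': 2, 'v!i198': 3 },
  { 'v!i777': 0, 'v!i241': 1, 'v!i952': 2, 'v!i198': 3 });
```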

We can take humble inspiration from linked lists and inject a new key-value pair.

key | value |
---|---|
v!o952 | 'i241' |
v!o198 | 'i952' |
The key idea is to let elements describe their predecessor. This in turn creates a graph, and a simple topological sort will build an order. There will be ambiguity when two people introduce a new key with a shared predecessor. We can break this ambiguity by using the generated key as an ordering mechanism, so the server and client converge on the same order. There is also the problem of two people inventing the same key, but this can be addressed by increasing the number of bits, which diminishes the impact of the birthday paradox.

However, this is a costly approach; can we do better?

Every change to an ordering key will implicitly require topologically sorting the induced graph. For a “linked list”, this sucks. This raises two questions. First, can the transmission format be upgraded to simplify the reconstruction? Second, what caching can we leverage to accelerate this?

As an ideal, we would want to take an entire collection of these elements and compute the ordering with a complexity cost of n log_{2} n, and the cost of integrating a single change should be roughly log_{2} n.

Suppose N items in the array have arrived; the initial question is how to construct the actual array. What we have is a set of items keyed by ids prefixed with ‘i’ and a graph formed by keys prefixed with ‘o’. We can construct the graph by scanning the keys that have a prefix of ‘o’; this graph construction is N log_{2} N. Now, we sort the ids and walk them: starting from each root (an item with no predecessor), we emit the item and then recursively emit its successors, annotating each element as handled once it has been inserted into the array. The complexity of this construction is N log_{2} N. Let’s sketch this out:

```
function bring_the_order(raw) {
  // we need an index of the items by key along
  // with associated metadata
  var items = {};
  // a graph of an item's next elements
  var graph = {};
  // all the ids in the index
  var ids = [];
  // cache map of whether or not an id is a root
  var not_roots = {};
  for (var key in raw) {
    // the key is the pairing of an operation and an id,
    // so extract the id here
    var id = key.substring(1);
    if (key[0] == 'i') { // then inspect the operation
      // here, the operation is insert, so index the value and
      // annotate whether or not the item has been handled
      items[id] = { value: raw[key], handled: false };
      ids.push(id);
    } else { // key[0] == 'o'
      // the operation is ordering, so build the graph up
      var pred = raw[key];
      if (pred in graph) {
        // a conflict: two items share the same predecessor
        graph[pred].push(id);
      } else {
        // no conflict (yet), just make the mapping
        graph[pred] = [id];
      }
      // the value at this id is not a root
      not_roots[id] = true;
    }
  }
  // we order the ids such that ambiguities are resolved
  ids.sort();
  // prepare the final result
  var result = [];
  // sub method to handle each item
  var handle = function(id) {
    var val = items[id];
    if (val.handled) {
      // the item has already been handled
      return;
    }
    val.handled = true;
    // if the item has successors, then we handle them now
    if (id in graph) {
      var next = graph[id];
      // sort again to handle ambiguities
      next.sort();
      // we now insert all the values immediately
      for (var kn = 0; kn < next.length; kn++) {
        result.push(items[next[kn]].value);
      }
      // we now recurse the successors and handle them
      for (var kn = 0; kn < next.length; kn++) {
        handle(next[kn]);
      }
    }
  };
  // process each root node
  for (var k in ids) {
    var id = ids[k];
    if (!(id in not_roots)) {
      // write the result out
      result.push(items[id].value);
      // recurse and dump the successors
      handle(id);
    }
  }
  return result;
}
```

We can test this to see if it works (more or less):

```
console.log(bring_the_order({
  'i42': 'a',
  'i234': 'b',
  'iXYZ': 'c',
  'i99': 'e',
  'i$': 'd',
  'o99': '$',
  'o42': '$',
  'oXYZ': '42',
  'o234': 'XYZ',
}));
```

which yields

```
[ "d", "a", "e", "c", "b" ]
```

I’m sure there is some l33t-code way to optimize this. The important thing, however, isn’t the full construction. Rather, it’s the differential version, where a single update (or small set of updates) manifests a reconstruction of the array. It would be bad form if a single update required a full N log_{2} N reconstruction. So, we ask how to handle a single order instruction when it is received. I’ll move on from full reconstruction towards partial updates, but if anyone has a better idea on the full reconstruction, then I’m happy to be educated. There’s also probably a much better format for communicating the data and order within an array as well, and I’m happy to learn.

If we assume that we have an array that is appropriately ordered, then what do we need when we encounter a patch of the form:

```
{
  'oX': 'Y'
}
```

The above means the element with id ‘X’ is now after element ‘Y’. For this to be meaningful, both elements X and Y must exist. We must also assume that the delta has a complete change set. For example, inserting a new head element would have a delta like this:

```
{
  'iX': 'x-value',
  'oX': null,
  'oF': 'X',
}
```

where element F used to be the head, and now X is the head. It’s debatable whether ‘oX’ needs to be in the delta, but ‘iX’ definitely needs to be in the mix. The above scenario outlines “insert at head”, so perhaps it is useful to illustrate all the scenarios. In the “remove an element” scenario, we would expect a delta like:

```
{
  'iX': null,
  'oN': 'P'
}
```

where element N is the next element after X, and element P is the element prior to X. We only need ‘oN’ to handle the case when we need a full reconstruction. Let’s look at swapping two elements:

```
{
  'oA': 'old-B-prior',
  'oA-next': 'B',
  'oB': 'old-A-prior',
  'oB-next': 'A'
}
```

At this point, this is getting complicated, which suggests that maybe a linked list is not ideal. Instead, let’s radically simplify by using doubles to indicate the numerical order of the elements. We use doubles rather than integers because we can easily bisect them. Instead of inventing random order keys, we use doubles spaced at regular multiples of a gap.

This has the benefit of radically simplifying the full reconstruction, and the partial construction is now just an insertion sort.

```
function bring_the_order_2(m) {
  // collect (order, value) pairs
  var arr = [];
  for (var k in m) {
    arr.push({v: parseFloat(k), a: m[k]});
  }
  // sort numerically by the order key
  arr.sort(function(a, b) { return a.v - b.v; });
  // emit the values in order
  var result = [];
  for (var k in arr) {
    result.push(arr[k].a);
  }
  return result;
}
```

which we can test via

```
console.log(bring_the_order_2({
  '750': 'a',
  '3000': 'b',
  '2000': 'c',
  '1000': 'e',
  '500': 'd',
}));
```

which yields

```
[ "d", "a", "e", "c", "b" ]
```

This illustrates a simpler path, but we now have to address how insertions happen. Problematically, re-ordering would break updates to the data because the key would change, so we really have to introduce a level of indirection. We re-introduce the difference between ‘i’- and ‘o’-prefixed keys.

```
function bring_the_order_3(m) {
  var arr = [];
  var data = {};
  for (var k in m) {
    if (k[0] == 'i') {
      // 'i' keys map an id to its data
      data[k.substring(1)] = m[k];
    } else {
      // 'o' keys map a numeric order to an id
      arr.push({v: parseFloat(k.substring(1)), a: m[k]});
    }
  }
  // sort numerically by the order key
  arr.sort(function(a, b) { return a.v - b.v; });
  // resolve each id to its data, in order
  var result = [];
  for (var k in arr) {
    result.push(data[arr[k].a]);
  }
  return result;
}
```

This level of indirection allows the order of the elements and the intra-element data to update independently. We test:

```
console.log(bring_the_order_3({
  'i42': 'a',
  'i234': 'b',
  'iXYZ': 'c',
  'i99': 'e',
  'i$': 'd',
  'o750': '42',
  'o3000': '234',
  'o2000': 'XYZ',
  'o1000': '99',
  'o500': '$',
}));
```

which yields

```
[ "d", "a", "e", "c", "b" ]
```

Yay. Now, the open question is the rules for generating the keys. For the elements that have a key prefixed by ‘i’, we generate a random key of six hexadecimal digits which doesn’t already exist. This means there are roughly 16M possible keys. For the given domain, I would expect around 10 items in any array, so let’s ask what the probability of a duplicate key is if 100 keys are generated at the same time. By the birthday bound, the probability is roughly 0.03%, and the odds of adding 100 items at once are slim to none. If 3 people each generate 3 unique keys during the same second before joining, the probability is on the order of 0.0002%. This is sane.
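A sketch of that id generator (`genId` is my name, not from the post’s codebase); the retry loop makes a duplicate within a single client a non-issue at the expected array sizes:

```javascript
// Generate a six-hex-digit id that isn't already in the index.
function genId(existing) {
  while (true) {
    var id = '';
    for (var k = 0; k < 6; k++) {
      id += '0123456789abcdef'[Math.floor(Math.random() * 16)];
    }
    if (!(id in existing)) {
      return id;
    }
  }
}

var used = {};
var a = genId(used);
used[a] = true;
var b = genId(used); // cannot collide with a: the loop retries
```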

For inventing the order key, the value is chosen by the following rules:

rule | used |
---|---|
array is empty | 1024 + 1024 * RNG() |
head of array | HEAD() / 2 + HEAD() / 4 * (0.5 - RNG()) |
end of array | TAIL() + 1024 + 1024 * RNG() |
between(A,B) | (A() + B())/2 + (B() - A()) / 4 * (0.5 - RNG()) |

This scheme exploits floating point, and we can keep the encoded size down by preferring integers. The server can, at any time, rebase the ordering; this further illustrates the importance of keeping the order changes separate from the data changes.
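The table above can be sketched as one function, assuming the current order keys are available as a sorted array of doubles (`orderKeyFor` and the position-based interface are my own framing):

```javascript
// Pick an order key for a new element landing at `position`
// (0 = head, sortedKeys.length = tail) per the rules table.
function orderKeyFor(sortedKeys, position) {
  var rng = Math.random();
  if (sortedKeys.length === 0) {
    // array is empty
    return 1024 + 1024 * rng;
  }
  if (position <= 0) {
    // head of array: land strictly below the current head
    var head = sortedKeys[0];
    return head / 2 + head / 4 * (0.5 - rng);
  }
  if (position >= sortedKeys.length) {
    // end of array: land strictly above the current tail
    return sortedKeys[sortedKeys.length - 1] + 1024 + 1024 * rng;
  }
  // between two neighbors: midpoint, jittered within the gap
  var a = sortedKeys[position - 1];
  var b = sortedKeys[position];
  return (a + b) / 2 + (b - a) / 4 * (0.5 - rng);
}

var keys = [500, 750, 1000];
var mid = orderKeyFor(keys, 1); // strictly between 500 and 750
```

The jitter term keeps two clients inserting at the same spot from trivially colliding on the same double.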

At this point, you may wonder why I even bothered talking about the linked-list approach. Well, that’s how I started! And, I don’t revise my writing that often. I’m writing this document as I’m solving the problem, and I’m happy with the numeric scheme as it is simple. This illustrates the difficulty of arrays of objects. Since the data is independent of the ordering, it also addresses how to solve recursive patching.

This work has given me a prototype flatpatch function:

```
var _flatpatch = function(obj, key, value, history, delay) {
  // look for a dot and a !
  var kDot = key.indexOf('.');
  var kNot = key.indexOf('!');
  // if the dot is present and there is no closer !
  if (kDot > 0 && (kDot < kNot || kNot < 0)) {
    // extract the child key
    var ckey = key.substr(0, kDot);
    if (!(ckey in obj)) {
      obj[ckey] = {};
    }
    // recursively patch into the object
    _flatpatch(
      obj[ckey],
      key.substr(kDot + 1),
      value,
      history + key.substr(0, kDot + 1),
      delay);
    return;
  } else if (kNot > 0) {
    // get the key prior to the !
    var nkey = key.substr(0, kNot);
    // infer the mode of the operation
    var mode = key[kNot + 1];
    if (mode == 'i') {
      // data via a proxy index;
      // we create a proxy object to hold all the elements
      var ckey = '#' + nkey;
      if (!(ckey in obj)) {
        obj[ckey] = {};
      }
      // recurse into the object
      _flatpatch(
        obj[ckey],
        key.substr(kNot + 2),
        value,
        history + key.substr(0, kNot + 1),
        delay);
    } else if (mode == 'o') {
      // order change; make sure we have a cache object
      // to store the order
      var ckey = '~' + nkey;
      if (!(ckey in obj)) {
        obj[ckey] = {};
      }
      var order = obj[ckey];
      // if the value was to delete it, then we remove
      // it from the order object
      if (value == null) {
        delete order[key.substr(kNot + 2)];
      } else {
        order[key.substr(kNot + 2)] = value;
      }
      // make sure the proxy index exists even if an order
      // change arrives before any data
      if (!(('#' + nkey) in obj)) {
        obj['#' + nkey] = {};
      }
      var items = obj['#' + nkey];
      // emit a function to build the array out of band;
      // this allows multiple insertions to share one rebuild
      delay[history + nkey] = function() {
        var arr = [];
        for (var k in order) {
          // the order keys are stored without a prefix,
          // so parse the whole key as the numeric order
          arr.push({
            v: parseFloat(k),
            assoc: items[order[k]]});
        }
        arr.sort(function(a, b) { return a.v - b.v; });
        var result = [];
        for (var k in arr) {
          result.push(arr[k].assoc);
        }
        obj[nkey] = result;
      };
    }
    return;
  } else {
    // delete the value
    if (value === null) {
      delete obj[key];
    } else {
      obj[key] = value;
    }
  }
};
```

This unblocks a current endeavor, so I’m going to run with it!
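For completeness, here is a tiny driver (my own sketch, with a toy patch function standing in for `_flatpatch`) showing the intended call pattern: apply every key of a delta, then flush the delayed rebuilds so several order changes share a single array reconstruction.

```javascript
// Apply a whole delta, then run each queued rebuild exactly once.
function applyDelta(obj, delta, flatpatch) {
  var delay = {};
  for (var key in delta) {
    flatpatch(obj, key, delta[key], '', delay);
  }
  // keyed by path, so repeated order changes collapse to one rebuild
  for (var path in delay) {
    delay[path]();
  }
}

// a toy patch function standing in for _flatpatch: it writes the
// leaf and queues a rebuild, which is all the driver relies on
var toyFlatpatch = function(obj, key, value, history, delay) {
  obj[key] = value;
  delay[history + key] = function() { obj.rebuilt = true; };
};

var target = {};
applyDelta(target, { a: 1, b: 2 }, toyFlatpatch);
// target now holds both leaves, and the queued rebuilds have run
```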