### Pricing things in USD onchain

Posted: **Wed Feb 26, 2020 7:45 pm**

There have been some suggestions to peg various costs to the Namecoin/USD exchange rate, so that renewing a domain costs X USD instead of X namecoins. This, however, presupposes that you know the exchange rate, and mechanisms for determining it have some issues. To quote a senior developer:


> IIRC the previous discussions concluded it:
>
> (1) gives way too much power to miners;
>
> (2) gives way too much power to exchange rate API operators;
>
> (3) seems redundant because it's basically just reinventing the concept of stablecoins, and it would be easier to make Namecoin a pegged sidechain of a stablecoin [...]

Using the median of the last N blocks' commitments was (AFAIK) the best (or rather the least bad) way proposed in those old discussions to keep the rate stable. Here is my proposal, which should hopefully be less brittle:

Divide the time up into periods of N days. In this example, I'll use N = 7, so 1 week.

For each period, calculate an exchange rate. This should be done using some robust formula; I propose the median of closing prices. You might as well round this to a fixed percentage grid to make some calculations easier, so store floor(log_1.05(exchange rate)) to get a nice integer (each step of the integer then corresponds to a ~5% bucket).
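As a rough sketch of this step (names like `quantize_rate` are mine, not from the proposal), the median-then-round computation might look like:

```python
import math
from statistics import median

def quantize_rate(closing_prices):
    """Collapse one period's closing prices into the committed integer.

    The median is robust against a few outlier days, and flooring
    log base 1.05 rounds the rate into ~5% buckets, so honest miners
    using slightly different data sources still tend to land on the
    same integer.
    """
    rate = median(closing_prices)            # robust period estimate
    return math.floor(math.log(rate, 1.05))  # ~5%-wide bucket index

def bucket_to_rate(bucket):
    """Representative exchange rate for a stored bucket."""
    return 1.05 ** bucket
```

For example, a week of closes around 2.00 USD maps to bucket 14 even if one day's close is a wild outlier, since the median ignores it.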

In each coinbase, miners commit to the exchange rate in the previous period (T-1) and the one before that (T-2). The T-2 commitments are binding, the T-1 commitments are not.

If a node sees a T-2 commitment it doesn't agree with, it penalizes that block's chainwork. Not heavily: if the node's own estimate is the one that's wrong, it should still converge to the majority chain eventually, but miners should face a higher probability of getting their block orphaned if they use a bad value for the commitment.
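A minimal sketch of such a penalty, assuming a per-bucket multiplicative discount (the 1% factor is an illustrative constant, not part of the proposal):

```python
def effective_chainwork(block_work, block_commitment, local_commitment,
                        penalty=0.99):
    """Discount a block's chainwork contribution when its binding (T-2)
    commitment disagrees with this node's own estimate.

    The penalty is deliberately mild so a node with a wrong local
    estimate still converges to the majority chain, while a miner
    committing to a bad value runs a slightly higher orphan risk.
    """
    distance = abs(block_commitment - local_commitment)  # in ~5% buckets
    return block_work * (penalty ** distance)
```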

Once period T-2 has become T-3, the node takes the median of that period's binding commitments and uses this value as the USD exchange rate until the end of the current period.
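Sketch of that finalization step, assuming each block contributes one integer commitment (even-length ties here simply round down, a detail the text doesn't pin down):

```python
from statistics import median

def period_rate(binding_commitments):
    """Reduce a finalized period's binding (T-2) commitments to one
    consensus integer via the median.

    With roughly 1008 blocks in a one-week period, a manipulator
    needs a majority of the period's blocks to move the median at all.
    """
    return int(median(binding_commitments))  # ties round toward zero
```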

The advantage of this approach is that it's clear as day what the value should be: getting fresh data to the nodes requires an API, but two-week-old data could easily be checked by hand, and in the event an API is used, it can be done in a much more robust way.

If 200 APIs are hardcoded into the client, it could pick 20 at random, throw out the 5 lowest and 5 highest answers, and compute the variance of the rest. If the variance is too high, it asks a human to decide; otherwise, it takes the median and calls it a day.
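That selection logic could be sketched as follows; the 10% spread threshold and the callable-per-API shape are assumptions, since the text leaves "too high" unspecified:

```python
import random
from statistics import median, pstdev

def fetch_rate(apis, sample_size=20, trim=5, max_rel_spread=0.10):
    """Sample hardcoded rate APIs and reduce them to one number.

    Picks `sample_size` APIs at random, throws out the `trim` lowest
    and highest quotes, and checks the relative spread of the rest.
    Each entry in `apis` is assumed to be a zero-argument callable
    returning a USD price as a float.
    """
    quotes = sorted(api() for api in random.sample(apis, sample_size))
    trimmed = quotes[trim:-trim]                # drop outliers
    mid = median(trimmed)
    if pstdev(trimmed) / mid > max_rel_spread:  # too much disagreement
        raise ValueError("API answers disagree too much; ask a human")
    return mid
```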

If these mechanisms fail, it can rely on the previous indicative exchange rate commitment and hope the gentlemen's agreement not to manipulate it hasn't been violated.

To further disincentivize manipulation, it could be clamped to something like ± 25% per week, like the block size.
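In bucket terms, such a clamp is nearly a one-liner; the choice of 5 buckets below is illustrative, chosen because it is the integer step closest to the suggested bound:

```python
def clamp_bucket(new_bucket, prev_bucket, max_step=5):
    """Limit period-over-period movement of the committed rate bucket.

    Buckets are log base 1.05, so max_step = 5 caps each move at about
    1.05**5 - 1, roughly 28%, close to the +/- 25% per week suggested
    in the text.
    """
    low, high = prev_bucket - max_step, prev_bucket + max_step
    return max(low, min(high, new_bucket))
```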

A would-be manipulator would have to create a majority of a period's blocks with visibly incorrect information: at 6 blocks per hour, that's 7*24*6/2 + 1 = 505 of the week's 1008 blocks. For the manipulation to be profitable, the committed value would have to be very far out of line; 2% is not going to be close to profitable. A 10-20% discrepancy would be painfully obvious, and penalized accordingly.

To summarize: the key takeaway is that nodes don't have to know what a namecoin is worth in USD constantly, only what it was worth at a specific point in time a few weeks ago. This makes it sufficiently low-bandwidth to be handled on an ad-hoc basis.

N = 7 is not guaranteed to be ideal. In my jerry-rigged Monte Carlo simulations, even N = 31 still seems to keep the 95th percentiles within 50-200% of the true value.

There is a formula for estimating this (price * annualized volatility * sqrt(time / 1 year) = 1 sigma), but I don't know how to adapt it to a log-normal distribution. An analysis by someone with better quantitative skills than mine to determine the optimal N would be much appreciated.
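A jerry-rigged simulation in the same spirit might look like this; the 100% annualized volatility is an assumed input, not market data, and the model is a driftless log-normal walk:

```python
import math
import random

def one_sigma_move(price, annual_vol, days):
    """The rule of thumb from the text: price * vol * sqrt(t / 1 yr)."""
    return price * annual_vol * math.sqrt(days / 365)

def stale_price_error_p95(annual_vol=1.0, staleness_days=31,
                          trials=20_000, seed=1):
    """Monte Carlo the 95th-percentile relative error of acting on an
    N-day-old price, under a driftless log-normal model.

    The log-price after t years is normal with standard deviation
    annual_vol * sqrt(t), which is the log-normal version of the
    one-sigma rule of thumb above.
    """
    rng = random.Random(seed)
    sigma = annual_vol * math.sqrt(staleness_days / 365)
    # Relative error between stale and true price: |exp(log move) - 1|.
    errors = sorted(abs(math.exp(rng.gauss(0.0, sigma)) - 1.0)
                    for _ in range(trials))
    return errors[int(0.95 * trials)]
```

With these assumptions, the 95th-percentile error for a 31-day-old price comes out around 60%, which sits inside the "within 50-200% of the true value" band mentioned above.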