The current crop of mining calculators is linear. In our observation, the key drivers of mining profitability exhibit exponential behavior. A better model is needed.
We’re not claiming our model is perfect – but it’s better than a linear projection.
The Simulation Model
If you use any of the other popular mining calculators, you punch in your miner details and it tells you how much you’ll make today given your contribution to the global hash rate. Then it multiplies that by 7 for a weekly projection. By 30 for a month. By 365-ish for a year. Those pictures start to look real rosy, especially during a paradigm shift, when a more efficient piece of hardware is introduced and the calculator says to you, “I know there’s no such thing as a get-rich-quick scheme, but this mining calculator is for real!” This isn’t to say that mining isn’t profitable. It is. Wildly. But a better method is needed to project what a miner will make, so that one can make more accurate plans.
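To make the flaw concrete, here’s a minimal sketch of what those linear calculators do: compute today’s expected revenue, then just scale it. Every number below is illustrative, not a quote of any real network.

```python
# Hypothetical inputs -- illustrative numbers only.
miner_hash = 14e12        # the miner's hash rate, H/s
global_hash = 7e18        # network hash rate, H/s (held constant: the flaw)
blocks_per_day = 144      # ~one block every 10 minutes
block_reward = 12.5       # coins per block

# Today's expected coins: your share of the network's daily issuance.
daily = blocks_per_day * block_reward * miner_hash / global_hash

# The linear extrapolation described above.
weekly, monthly, yearly = daily * 7, daily * 30, daily * 365
```

The yearly figure silently assumes the global hash rate never moves, which is exactly the assumption the rest of this post replaces.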
The premise of our model is that three primary variables drive the amount of value produced by a miner: the miner’s hash rate, the global hash rate, and the exchange rate.
Hash rate is an independent variable. It’s supplied by the user and represents the individual hashes per second that the miner contributes to the global hash rate.
The global hash rate is the sum total of all miners focused on finding a block for a given network at a given time; the total sum of all of the individual hash rate contributions.
The foreign exchange rate (hereafter “fx rate”) is the ratio of the cryptocurrency the miner produces to another currency whose value is more relatable to the user. The fx rate is obviously not required to determine value, but to the uninitiated, omitting it would be akin to attempting the mental math required to convert kilometers to miles while the ratio swings as wildly as the bitcoin price has grown. To put that in perspective, taking that example literally: imagine being asked how many kilometers/miles away something is when today’s 10K stretches from New York to San Francisco, yet only six years prior it took place entirely within Central Park.
So global hash rate isn’t static. It’s growing – there are always new miners for sale. Miners have been getting faster, contributing more and more hash power to the global network. So as more machines are added to the network, the hash rate grows. This needs to be modeled.
The fx rate is also changing. Now, I’m the first to tell you that the answer to whether the fx rate will go up or down is “yes”. But there is information to be gleaned from how it’s behaved in the past, and how it behaves with respect to other things that may or may not be related.
Deciding what is related to what, or which variable is the cause or the effect – that’s not for me to determine. Instead we model the correlation – which as they say does not imply causation – between global hash rate growth and fx appreciation/depreciation.
The Simulation Model doesn’t presume to know anything about either variable. Global hash rate and exchange rate are modeled as simple geometric Brownian motions. These models follow a stochastic differential equation driven by a Wiener process: there’s a drift or trend component, and a random component governed by statistics like the standard deviation.
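The discretized step is compact enough to sketch. This is a generic GBM path generator, not our production code; `mu` and `sigma` (per day) are assumed inputs, which in our case come from the EWMA statistics described next.

```python
import math
import random

def gbm_path(s0, mu, sigma, steps, seed=None):
    """One geometric Brownian motion path with daily steps:
    S_{t+1} = S_t * exp((mu - sigma^2 / 2) * dt + sigma * sqrt(dt) * Z)."""
    rng = random.Random(seed)
    dt = 1.0  # one step = one day
    path = [s0]
    for _ in range(steps):
        z = rng.gauss(0.0, 1.0)  # the Wiener increment, scaled by sqrt(dt)
        path.append(path[-1] * math.exp((mu - 0.5 * sigma ** 2) * dt
                                        + sigma * math.sqrt(dt) * z))
    return path
```

The exponential form is why GBM paths stay positive: the random shock multiplies the level rather than adding to it.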
We grab hash rate history and exchange rate history and collect statistics using an exponentially-weighted moving average (EWMA) model. This method has the benefit of weighting recent events more heavily and letting the effect of more distant data points fade into obscurity. It gives us the mean, which we use as the drift, and the standard deviation, which is represented by sigma in the geometric Brownian motion model above right.
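The EWMA recursion itself is a few lines. A sketch, where `lam` is the decay factor, a free parameter (RiskMetrics-style daily models conventionally use 0.94):

```python
def ewma_stats(returns, lam=0.94):
    """Exponentially weighted mean and volatility of a return series.

    Each new observation gets weight (1 - lam); older observations decay
    geometrically, so distant history fades into obscurity."""
    mean = returns[0]
    var = 0.0
    for r in returns[1:]:
        mean = lam * mean + (1.0 - lam) * r
        var = lam * var + (1.0 - lam) * (r - mean) ** 2
    return mean, var ** 0.5  # (drift estimate, sigma estimate)
```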
Finally, we need to generate the random variables used in the simulation. To do so, we need to generate a value for each variable that belongs to the joint probability space; in other words, a plausible set of values given each variable’s movements and co-movements. You need to generate values that reproduce the distribution you fed in (image right): its spread (sigma), its lean toward positive or negative (skewness), and the girth of its tails (kurtosis).
To perform this process multi-dimensionally, we create Gaussian copulas: multivariate probability distributions with uniform marginals (image left). Imagine taking one of the two-dimensional normal distributions and spinning it like a dreidel in three dimensions. Then recognize that each dimension might be shaped differently based on sampled properties of historic changes, like in the image above right.
This allows us to create scenarios in which both variables occur within the probability space defined by the time series of co-movements.
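A two-variable version of that sampling step can be sketched as follows. Here `rho`, the correlation between the two series’ returns, is an assumed input (estimated from the co-movement history); real copula code would also push each uniform through its marginal’s inverse CDF.

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF: maps a Gaussian draw to a uniform in (0, 1)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gaussian_copula_pairs(rho, n, seed=None):
    """Draw n pairs of correlated uniforms from a 2-D Gaussian copula.

    z2 is built from z1 via the 2x2 Cholesky factor, so (z1, z2) have
    correlation rho; applying the CDF strips the normal marginals,
    leaving only the dependence structure."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
        pairs.append((norm_cdf(z1), norm_cdf(z2)))
    return pairs
```

Feeding each uniform through the inverse CDF of its own fitted marginal is what lets the hash rate and fx rate dimensions be “shaped differently” while still co-moving.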
In short, we grab the time series to determine how much the global hash rate growth varies (volatility), the direction it’s been headed lately (the mean), and how things tend to move together.
To simplify these calculations, we had to make some sweeping assumptions, which follow the explanation of the model.
We built this model in the second half of September 2017. This was a particularly interesting time to perform this analysis, yielding some interesting results.
The Simulation Model Results
The variance is too high, for a myriad of reasons.
First and foremost is recent history. We’re observing the beginnings of new paradigms in all three currencies under observation: Dash ASICs were introduced days ago; Litecoin’s a few months ago, still within reach of our EWMA’s 72-day lookback period; and Bitcoin just forked contentiously, allowing the same miner to target two different currencies, Bitcoin or Bitcoin Cash, resulting in changes of 50% or more to the global hash rate as hash power swings between BCH and BTC. Of course the global hash rate will explode up the J-curve to infinity when the standard deviation is built from observations where the hash rate swings 50% a day.
We’ve enabled use of this model, but we expect that until things calm down (if they ever calm down), this model may produce some wonky results.
The Batch Model
During an internal discussion, we realized we had a pretty good idea of how much the hash rate grows. While creating this model, we accurately projected the hash rate growth of the Dash network. Bitmain even felt obligated to clarify that they don’t disclose batch sizes, conspicuously close to the release of our projection, indicating we were likely on the right track.
So if that’s the case, we know how the hash rate will grow: roughly 4,000 units every 3–4 weeks. Why bother simulating it? We’ll simulate the fx rate independently and grow the hash rate linearly.
One thing to consider would be the pool size and the probability that the pool the miner subscribes to finds x blocks per day, as it may be more or less than you might expect. To simplify the model, we assume that no pool has an advantage over another. You know, nobody is gaining an edge through ASICBoost, mining empty blocks, or just generally severing the elegant symbiosis between capitalistic motivation and benefit to the network. Therefore, the long-run equilibrium is that pool payouts trend toward rewards being evenly distributed, weighted on hash power.
This makes the math tremendously easier to work with, as we can allocate the daily rewards of the entire network by individual hash rate contribution. Now it’s continuous instead of discrete.
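Under that even-distribution assumption, the continuous allocation is a one-liner. A sketch (the figures in the test are illustrative):

```python
def expected_daily_coins(miner_hash, global_hash, block_reward,
                         blocks_per_day=144):
    """Expected coins per day: the miner's proportional share of issuance.

    No pool luck, no pool fee -- rewards accrue continuously,
    weighted purely by hash power."""
    return blocks_per_day * block_reward * miner_hash / global_hash
```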
We also capture difficulty adjustments. Every 2016 blocks (on Bitcoin), you’d adjust the difficulty at that snapshot, resulting in more accurate network behavior.
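A Bitcoin-style retarget at each snapshot can be sketched like this; the 4x clamp matches Bitcoin’s consensus rule, while other chains (Dash, for instance) retarget far more often.

```python
def retarget(difficulty, actual_window_seconds,
             blocks_per_window=2016, target_block_seconds=600):
    """Scale difficulty so the next window takes the target time.

    If blocks arrived twice as fast as intended, difficulty doubles.
    Bitcoin clamps each adjustment to a factor of 4 in either direction."""
    ratio = (blocks_per_window * target_block_seconds) / actual_window_seconds
    return difficulty * max(0.25, min(4.0, ratio))
```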
The next step is to fix our time series. We need to improve our data sources such that hash rate is imputed from block time directly observed from each respective blockchain. You’d then want to run some sort of smoothing algorithm to reduce the aforementioned volatility problem.
Once block time granularity is achieved, you’d want to project hash rate per block. You’d therefore need to add the independent random variable of block time, known officially as luck. It’s possible Miner #247 finds the solution with a lucky guess five whole minutes ahead of schedule. This would allow you to run paths with a time-step interval of a block, and truly capture all the possibilities that might unfold.
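Block discovery is memoryless (every hash is an independent lottery ticket), so those inter-block “luck” draws are exponentially distributed. A sketch of the per-block random variable:

```python
import random

def next_block_seconds(expected_seconds=600.0, rng=None):
    """Draw one inter-block time from the exponential distribution.

    A Poisson block-arrival process means the time between blocks is
    exponential with the target mean; short draws are 'lucky' blocks."""
    rng = rng or random.Random()
    return rng.expovariate(1.0 / expected_seconds)
```

Stepping the simulation one draw per block, instead of one fixed daily step, is what captures a block landing five whole minutes ahead of schedule.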
When you’re projecting an exchange rate, there are a few arbitrage-induced guardrails that allow you to discern certain properties. If you know how something moves with respect to something else (covariance) and you know something about that something else (mean, variance, etc…), then you have information that you didn’t have before.
The first set of additional information you could incorporate are other exchange rates. Right now, we simulate each cryptocurrency independently, with respect to USD. If you were to combine the model, you may get better results. As many viewers of this model may not be based in USD, you could also extend this model to include other fiat exchange rates such as EUR, GBP, or JPY. Triangle trade arbitrage exerts pressure for currency pairs to adhere to certain behavior be it in crypto/crypto, crypto/fiat, or fiat/fiat.
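That triangular constraint is just a product identity, which makes the consistency check trivial. A sketch (the pair names and prices are hypothetical):

```python
def triangle_gap(btc_usd, ltc_usd, ltc_btc):
    """Relative deviation from the no-arbitrage identity
    LTC/USD = LTC/BTC * BTC/USD.

    A persistent nonzero gap would be free money for arbitrageurs,
    so simulated paths should keep it near zero."""
    implied_ltc_usd = ltc_btc * btc_usd
    return (ltc_usd - implied_ltc_usd) / implied_ltc_usd
```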
If you’re modeling an fx pair that includes fiat, then you could also take into account global interest rates, which incidentally could also be used to discount future cash flows. Interest rate parity enforces similar behavior between a fiat currency and the country that controls its supply.
Things we could incorporate but willfully chose to omit:
For profitability’s sake, there are a few other considerations, including:
- Pool Fees (see above)
- Transaction fees (on the network)
- Transaction fees when converting to USD
- Bitcoin halving
- Other currency FX rates, for those of you living outside the US. I seem to recall that you can find this data from the Federal Reserve Bank of St. Louis. Note that this information would need to be simulated as well and added to the covariance matrix.
- You could include interest rates if you wanted to present-value the investment. You can find interest rates (ones literally close enough for government work) from ISDA, which has to make them public for credit default swap standardization models. Thank the credit crisis.
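That last item, discounting, is simple enough to spell out. A sketch assuming a single flat annual rate; a proper term structure would use a different rate per tenor.

```python
def present_value(cashflows, annual_rate):
    """Discount (years_from_now, amount) pairs at a flat annual rate."""
    return sum(amount / (1.0 + annual_rate) ** years
               for years, amount in cashflows)
```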
If there’s interest, we may extend this model in the future to incorporate the assumptions above.
 Assuming the fx rate > 0. Here’s an exercise left to the reader: could there exist a scenario where fx rate < 0?
 …probably. It could technically be the same value between measurements.
 Which may or may not have a hardware efficiency baked in that’s only usable on one chain if said efficiency is to stay ambiguous.
 Though you might say that in making that projection and writing this blog, we’ve altered the state by causing a shift in production or distribution strategy.
 Until the inevitable introduction of cryptocurrency debt.