Bitcoin Q&A: What is difficulty targeting?

“Reviewing the material about mining on
slides 20 and 21, the last step refers to…
the hash being reviewed against the desired pattern
in order to arrive at the prize, the block reward.”
“How is this ‘desired pattern’ defined and
generated in the decentralized platform?”
“Is the desired pattern somehow
centralized and broadcast to each node?”
“What are the inputs to build this desired pattern?
[Does the pattern] change with each new block?”
Great questions. This is an area many of
our students find confusing, to say the least.
It is the topic of mining, which is not that easy
to understand. Let’s get our terms correct first.
The desired pattern is called the ‘target.’
The target defines the difficulty [for the network].
Through the proof-of-work algorithm, if a miner
achieves a result that is less than the target,
they are eligible to receive [the block reward], provided the
information in the block and its transactions is valid.
[Blocks are] validated against the
consensus rules by [every other node].
How is the target defined and how does it change?
This is a great question and an
area of confusion [to students].
The target is a number [that must]
be greater than the hash of the block.
It is simply a ‘greater than, less than’ operator being
used to compare [hashes] against the desired pattern.
The miners are mining by hashing the header of each
block. The hash they are producing, which looks like…
a long string of hexadecimal digits,
is essentially just a number.
If you think of the hash as a number,
then the target is another number.
The hash of a block [must] be less than the target.
One way I like to [illustrate] this: the target is like limbo,
where you have to dance and pass underneath this bar.
The lower the bar gets, the harder it is to
pass underneath it for each limbo dancer.
If the target is lowered, it is actually harder to
find a number that is smaller than that target.
Every time the target gets lower, the difficulty becomes
greater, because it is harder to find a number that fits.
That is the process by which the [difficulty]
target is compared to the block hash.
The target is a number that defines the difficulty
of the proof-of-work mining algorithm.
If you [look at] the target, what you notice immediately
is that the first few digits are zeroes.
While the number started very high back in 2009,
when Satoshi Nakamoto mined the first block,
that number has now become billions of times smaller,
making the calculation billions of times more difficult.
As it becomes a smaller number, that means
[more] leading digits of that number are zeroes.
For example, let’s [think of] a big number.
What is smaller than one million?
Nine hundred and ninety-nine thousand,
nine hundred and ninety-nine is smaller.
That can be written as ‘0999999.’
What is smaller than ‘0999999’?
One thousand is smaller, written as ‘0001000.’ As we
go down, there are [more] zeroes at the beginning.
Finding a number even smaller than that target
[becomes] more difficult, the smaller the target.
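The same leading-zero pattern shows up if you print successively smaller targets as 64 hexadecimal digits, the way block hashes are usually displayed. These example targets are arbitrary powers of two, not real network values.

```python
# The smaller the target, the more leading zeroes in its hex form.
for target in (1 << 240, 1 << 224, 1 << 200):
    print(f"{target:064x}")  # each line starts with more zeroes than the last
```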
The hashing process that miners conduct is effectively
random; the hash function produces [output that looks]
random, and you can’t predict what it will be.
How do you know if it is smaller? You can’t predict
whether it [will be] smaller than the target.
In order to find a number that is smaller than the target,
you [must] just keep trying again and again,
pulling out random numbers from the cryptographic
hash function, until one of them — by sheer chance —
is smaller than the [difficulty] target.
The lower the target, the more hashing you [must]
do before you can find one smaller than the target.
That is the process by which the [block] reward
is allocated, through the proof-of-work algorithm.
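That try-again-and-again loop can be sketched as follows. This is an illustration only, not real mining code: a real miner varies the nonce field inside an 80-byte header, and the target here is deliberately easy so the loop finishes quickly.

```python
import hashlib

def mine(header_base: bytes, target: int, max_tries: int = 10_000_000):
    """Keep trying nonces until a double-SHA-256 falls below the target."""
    for nonce in range(max_tries):
        candidate = header_base + nonce.to_bytes(4, "little")
        digest = hashlib.sha256(hashlib.sha256(candidate).digest()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest  # by sheer chance, this one is small enough
    return None  # gave up; lower targets need more hashing on average

# An easy target: roughly 1 try in 65,536 succeeds, so this runs fast.
easy_target = 1 << 240
result = mine(b"example header", easy_target)
```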
Going back to Roberta’s question, “Is the desired pattern
somehow centralized and broadcast to each node?”
No. Each node independently calculates
what the target should be and adjusts it.
It started with a specific number that was
hard-coded [with the genesis block] in January 2009.
Since then, every 2016 blocks, or approximately every
two weeks, we have a “re-targeting,” as it is called.
Every 2016th block exactly, every node in the network
calculates a new target for the [next 2016 blocks].
[When they see] that the latest block completes
[another] re-targeting period of 2016 blocks,
[they] independently re-calculate what the target should
be for [the next 2016 blocks], before the next one is mined.
What should that be? Let’s look at the previous 2016
blocks and see how long those [took] to be mined.
It should take 20,160 minutes [total],
because [the block interval] is ten minutes per block.
If we count how long it actually took
to mine the previous 2016 blocks,
and we find that it [took] less than 20,160 minutes,
we were [mining] blocks faster than we should [have].
The difficulty was not [great enough]. It was too easy.
The target [should] be lowered proportionately,
in order to make [the difficulty greater].
If it [took] longer than 20,160 minutes for [mining]
2016 blocks, that means it was too difficult.
We [were] finding blocks too slowly, and so
the target is [increased] to make it easier.
Again, that is done proportionately.
The formula is to proportionately adjust the target up or
down by the ratio of how long it actually took to mine 2016
blocks to how long it should have taken to find 2016 blocks,
which is 20,160 minutes [in about a two-week period].
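That adjustment can be sketched as follows. This is a simplification, not consensus-exact code: real implementations work on the compact 'bits' encoding in the header, and Bitcoin clamps the adjustment to a factor of four in either direction.

```python
EXPECTED_MINUTES = 2016 * 10  # 20,160 minutes per re-targeting period

def retarget(old_target: int, actual_minutes: float) -> int:
    """Proportionally scale the target by actual/expected mining time."""
    ratio = actual_minutes / EXPECTED_MINUTES
    # Damp extreme swings, as Bitcoin does, by limiting the change to 4x.
    ratio = max(0.25, min(ratio, 4.0))
    return int(old_target * ratio)

# Blocks arrived 10% too fast, so the target drops ~10% (mining gets harder).
new_target = retarget(1 << 240, 0.9 * EXPECTED_MINUTES)
```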
That proportionate adjustment is the same for every
node, even though they are not coordinating.
They can all count how long it took to [mine] the
previous 2016 blocks, [which will be] the same number…
across all of the nodes, because they count by
looking at the timestamps in the block headers.
They can also divide that number by 20,160 minutes,
and they will arrive at the same exact result.
If they multiply the target by that proportion,
they will have calculated a new target.
All of the nodes in the network, having calculated
the same inputs with the same equation,
will arrive at the same conclusion.
They will independently figure out what the target
should be for the next block in the series;
then 2016 blocks later, they will do it again, with the
same inputs in the same equation [for re-targeting].
Even though there is no synchronisation, they are using
the same inputs and all arrive at the same conclusion.
That becomes the consensus [difficulty] target.
Even if a node is lying and says they
found a block [with a different target],
since all nodes know what the target should be for this
[block] period, they will all check [blocks] against [it].
They will only accept a block if it has been mined to that
specification, with a block hash that is less than the target.
To answer the second question, “What are
the inputs to build this desired pattern?”
The number of minutes to mine the previous 2016
blocks, divided by the expected number of minutes.
The next question was, “Is there a new pattern
created whenever a new block is [mined]?”
“Does it change every block?” No, it changes every
re-targeting period: 2016 blocks, [or about] two weeks.
“Is 2016 blocks still an optimal adjustment [period]
considering volatility, or should it be more frequent?”
This is an ongoing debate which a lot of developers
in the Bitcoin community have from time to time.
There have been many suggestions for changing the
difficulty re-targeting algorithm, to make it more nimble.
There are disadvantages to making it more frequent;
there can be a sort of whiplash effect…
where short-term fluctuations affect the difficulty,
causing more short-term fluctuations…
which can actually increase volatility.
By doing [a re-targeting] every two weeks, the algorithm
acts as a damper, reducing volatility.
Some developers have suggested more sophisticated
algorithms than simply a moving average.
For example, using a proportional-[integral]-derivative,
or PID controller, a feedback mechanism…
with a different window for a moving average;
a bit like how cruise control works in your car.
There are advantages and disadvantages to every
proposal. None of them have progressed [so far].
Keep in mind, that would require a hard fork and
changing [a lot] of software in the ecosystem,
and massive coordination to remain in consensus.
It might be considered if, together with other changes,
there was a change in the format of the block header.
[People would want] to do a big upgrade for a hard fork.
Some of the recommendations for hard fork planning
include changes [to the] difficulty adjustment algorithm.
“What happens when the [hash rate] drops lower,
it makes no financial sense, and [miners] drop out?”
When the difficulty changes or profitability changes,
it doesn’t affect all miners to the same [degree].
There are thousands of miners out there, operating
with a fairly broad variety of hashing equipment…
electricity prices, labor costs, utility costs, real-estate
costs, etc., all of which determine their profitability.
[It is] a range. [Some] miners operate on the very latest
ASICs, installed and managed in the most efficient way,
where real-estate is dirt cheap, electricity flows
almost freely, and labor costs are minimum.
Those miners will be wildly profitable at the
current difficulty, because they are not the average.
Meanwhile, on the other end of the scale, [some miners]
are operating with previous generation chips,
where real-estate, electricity and
labor costs are expensive, etc.
They will not be profitable. Average profitability
[or net zero] is obviously between those two.
If average profitability changes, that is a moving bar;
more miners will fall below the threshold of profitability.
The least profitable [among] the miners will
abandon the field, and be replaced by miners with…
more efficient equipment and [better] locations.
“Eventually, everyone drops out,” is not [what] happens.
If more [miners] drop out, difficulty goes down.
When difficulty goes down, it becomes more profitable
for people who [stay], so they don’t drop out [for long].
It is a self-adjusting process. The fewer
[miners], the easier [the difficulty] gets.
The more [miners with a lot of
hash power], the harder it gets.
There is always someone making a profit in this
environment, but not everyone makes a profit.

31 thoughts on “Bitcoin Q&A: What is difficulty targeting?”

  1. I have a question if you don't mind. How can a dex work across chains? Is this even possible? I can understand how a smart-contract based dex can work, like in Ethereum or EOS, within a single ecosystem (tokens for tokens in ethereum, for example), because the smart contract can make an atomic swap, by having the transaction either done or undone. But let's say Bitcoin/Ether pair. How can a dex do this? Someone has to hold the private key where the bitcoins have to be deposited. If that key is available publicly (since we're talking about a public dex which is absolutely not controlled by any central authority), then anyone can steal the deposited coins. So is the idea of dex with atomic swaps across chains possible? Has anyone succeeded at this at all? If yes, please explain how this is done technically in the sense of: Who holds the private keys, or please tell me that my understanding is wrong. Thank you!

  2. What if the difficulty rises so much lets say 10x today's difficulty and then most of the miners drop out for any reason. Lets say 90% of the miners gone. Would the chain be in trouble of not being difficulty retargeted?

  3. Great explanation about mining difficulty. We will share this the next time somebody asks about how mining difficulty works. Thank you Andreas!

  4. Check out it shows when the difficulty retarget will occur (estimated date/time) and the estimated % change up or down.

  5. Another great video Andreas, please can you explain, to all the new people coming in, why it is harder to hash a smaller number as opposed to finding a lesser value under a reducing target range of numbers (the limbo dance, not to be confused with the lambo dance 🙂 )

    Muchas gracias amigo ;)))

  6. The difficulty retarget along side full blocks (when TX volume is high) and the futures market has created an attack vector on bitcoin BTC that didn't exist prior to December last year.
    When the network is clogged with transactions, and an upcoming difficulty increase is about to make it harder to mine a block(swings as high as +20% we're common in late 2017).
    "If" miner/s chooses to spam the network immediately after the difficulty adjustment and then shift mining hash to another chain, there is a possibility of chain death. The miner has five advantages
    1. Bitcoin is unregulated so they do not need permission
    2. There resources would no longer be split across chains
    3. They can short the market from its high and as the network "stops" moving the price crashes and they make multiples on their position
    4. They have two weeks before difficulty would readjust
    5. They are the only market participants that can hold a long short position with little downside risk
    6. They would be holding the alternative token that gives them another upside.

  7. Fantastic explanation on mining difficulty. So what will happen when every single miner drops out? will the whole blockchain disappear? Please clarify as I'm not much into BTC mining. But I do mine DeepOnion, which is a privacy related cryptocurrency and is currently profitable for me.

  8. Newbie question: If the mining program is fooled into thinking that a long time has passed since the last result … then would it make it easier to mine compared to anyone else? how is this situation avoided?

  9. I am glad that you promote yourself on all social media! Your cryptocurrency promotion and input in the industry is vast. I hardly know anyone with the same expertise and experience. I really enjoy listening to your lessons. Concerning pow, dont you think its a bit dated, pos is the future. I also see that privacy coin industry is booming and becoming a very interesting investment opportunity. I specifically look for lower cap projects with big and hyped communities. Kinda see a lot of fuzz around DeepOnion, have to study it closely…

  10. Not to undermine your work but I'm pretty sure these questions have already been answered by yourself some time ago… Maybe it would be good to create an index of all the FAQs

  11. Excellent answer to the profitability question Andreas. I hear this question all the time but I am not educated enough to know how to answer it. Now I do lol! Keep educating us.

  12. Please make a video about 00000000000000000021e800c1e8df51b22c1588e5a624bea17e9faa34b2dc4a event

  13. Can you explain the 21e800 mystery block hash that was created recently. Is super quantum computing playing a role already in Bitcoin or might this be a message from Satoshi.

  14. Would have been handy to mention that finding a number lower than the target gets increasingly difficult the lower the target gets, based on the grounds of probability. The output of the hash function, although deterministic, is essentially probabilistic in terms of the hash value relative to the sample space — which is why zillions of mining machines are required to churn through the range of 'nonce' (ie the 'seed' that changes iteratively to process the entire sample space) values until the minimum target is obtained.

  15. Question about bots… Leaving vast amounts of money on the exchanges is, as we all know, a bad Idea. But when running a bot using an dex api keys, surely there is no other way, leaving your capital at risk?
