Bitcoin - Probabilistic Map
Since traders are literally made of particles, it's worth knowing the principles of their behavior at the micro scale. Some people even incorporate planetary cycles into their charting. But I believe the answer lies deep in the quantum world of probabilities - the fabric of reality itself.
Reference to Quantum Mechanics
The universe itself prohibits 100% prediction accuracy. This is the Heisenberg Uncertainty Principle, one of the fundamental building blocks of Quantum Mechanics. To predict a particle's behavior, you need just two quantities:
1) Position of the particle
2) Momentum of the particle.
If you know its position and its momentum, you can predict its trajectory. So if you had the position and momentum of every particle in the universe, along with unlimited computational power, you could predict their behavior (interactions, movement, etc.) and essentially predict the future (stock market, weather, natural disasters, etc.).
However, the Heisenberg Uncertainty Principle states that it is impossible to measure a particle's position and momentum simultaneously with 100% certainty. The more certain you are about a particle's position, the less certain you are about its momentum, and vice versa.
So even if, with unlimited computational power, you could somehow pin down a particle's position at a given time with 100% accuracy, your uncertainty about its momentum would become infinite, preventing any accurate further predictions and rendering your model useless.
Hence, it is theoretically impossible to make a 100% accurate prediction, even with unlimited data and unlimited computational power.
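The trade-off described above is conventionally written as an inequality between the position uncertainty and the momentum uncertainty (standard physics notation, not specific to this article):

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

where ħ is the reduced Planck constant. Driving Δx toward zero forces Δp toward infinity, which is exactly the blow-up in momentum error described above.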
So, is the universe deterministic or probabilistic?
100% prediction accuracy would also mean the universe is deterministic - that there is only one possible outcome of the future. Einstein was on this side, famously insisting that "God does not play dice". On the other hand, Heisenberg, Max Born, Schrödinger, and the other founding fathers of Quantum Mechanics viewed the future as a set of possible outcomes, each with its own probability.
Since the market couldn't care less about anyone's subjective forecasts, I make predictions based solely on historic price dynamics at the macro scale, to stay objective and in tune with the market's pulse rather than burdened by my own endless interpretations of patterns. I don't need my consciousness to interpret, because we already have data derived from collective consciousness to work with. The chart is already a reflection of reality that captures the emotions of participants. In other words, it's a time fractal that exposes the essence of the market across timeframes; the market itself, in turn, is a function of trading time. This basis justifies linking systematic fragments of cycles to work out the capacity of price action. In Fractal Analysis, the question is essentially: how can direct metrics of historic waves geometrically explain current and future price levels?
The Fibonacci sequence is a mathematical concept that appears throughout nature. This connection between mathematics and the natural world is a fascinating example of how patterns and structures found in abstract concepts like numbers can manifest in physical reality. In particular, the Golden Ratio serves as a key rule that governs order within chaos.
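For the curious, the link between the Fibonacci sequence and the Golden Ratio can be verified in a few lines. This is a minimal sketch; `fib_ratio` is just an illustrative helper name:

```python
import math

def fib_ratio(n: int) -> float:
    """Return the ratio of two consecutive Fibonacci numbers after n steps."""
    a, b = 1, 1
    for _ in range(n):
        a, b = b, a + b
    return b / a

# The ratio converges to the Golden Ratio phi = (1 + sqrt(5)) / 2,
# whose inverse ~0.618 is the fib level used throughout this article.
phi = (1 + math.sqrt(5)) / 2
print(fib_ratio(30))  # ≈ 1.618033...
print(1 / phi)        # ≈ 0.618033...
```

The convergence is fast: by the 30th step the ratio already matches the Golden Ratio to well beyond chart precision.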
In TradingView, the "Fib Channel" is a great tool to capture the waves (domestic certainty) and turn them into a probabilistic interconnected structure that captures the uncertainty of the market - the entanglement of price action.
To start with, it's vital to use a log scale, where equal percentages are captured as equal distances. A 100% growth, say from $40 to $80, measures the same vertical distance as one from $1000 to $2000. Besides, percentages are what drive people's emotions, which in turn affect market behavior (collective executions). For finding geometric relationships between waves, a log scale is a must.
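The "equal percentages, equal distances" property of the log scale can be checked directly, using the same numbers as above:

```python
import math

# On a log-scaled chart, equal percentage moves span equal vertical distance.
# A +100% move from $40 to $80 and one from $1000 to $2000 both measure ln(2):
d1 = math.log(80) - math.log(40)
d2 = math.log(2000) - math.log(1000)
print(d1, d2)  # both ≈ 0.6931 (= ln 2)
```

On a linear scale those same two moves would span $40 and $1000 respectively, which is why wave geometry only lines up on a log chart.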
As I've done this before, I want to show how the market deviates near the fibs.
A channel in the direction 2013 HIGH ⇨ 2017 HIGH, anchored to the 2011 bottom, gives the next bottom, in 2015, at 0.618 after an -86% drop.
It also predicts the COVID bottom of 2020 after a -72% drop, as well as the current level where price has cooled down locally.
We can note that previous ATHs are explained by a logarithmic curve.
That's why we need another fib channel, connecting the 2017 HIGH ⇨ 2021 HIGH direction with the previous bottom of the -86% drop in 2015. A channel in that direction predicts the bottoms of 2018 (-84%) and COVID 2020 (-72%), again at 0.618.
Together they produce an interference pattern that covers the significant historic price swings.
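As a rough sketch of the mechanics: a 0.618 level on a log-scaled chart sits 61.8% of the way down the log-range from the high. `log_fib_level` is a hypothetical helper and the prices below are illustrative, not the exact anchor values from the chart:

```python
import math

def log_fib_level(low: float, high: float, ratio: float = 0.618) -> float:
    """Price at a given fib ratio below the high, measured in log space
    (consistent with a log-scaled chart)."""
    lo, hi = math.log(low), math.log(high)
    return math.exp(hi - ratio * (hi - lo))

# Illustrative range only: a 0.618 level between a $3,200 low and a $20,000 high.
print(round(log_fib_level(3200, 20000, 0.618), 2))
```

Note that the same function on a linear scale (plain interpolation instead of log interpolation) would give a different level, which is why the scale choice matters.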
To further interpret current levels through the chart itself, we can use a line at the angle of the direction connecting the 2021 double tops:
This shows the capacity for how high the market might still grow before the next significant correction, provided the local fib hasn't yet dimmed the bullish incentive.
Another straight line can be drawn connecting the 2020 COVID LOW (-72%) with the 2022 LOW, because we will probably never see such price levels again in the near future, as price has broken out with a high rate of change.
Now it would take more time and bearish capacity to get there. This line can indicate the bottom of a hypothetical correction, if one happens now. Other than that, it's a clear trendline with an almost 4-year wavelength.
Since straight lines don't exist in nature, I didn't extend them to the right. Now we need a more adaptive version to connect the recent local bottoms of the trend.
That would be a logarithmic trendline - in other words, a curve that mimics the function of exponential growth. Falling below it might therefore indicate the possibility of a correction, or even a reversal. Each day the price fails to grow along with the curve, the bullish momentum gets depleted. A cross below the logarithmic curve of spreading information would be a confirmation of a new bearish incentive. This is simply done to work out boundaries as limits of the function that explains the market.
The corrective wave has a timing of around 15 days, in keeping with its domestic volatility properties, before it either turns bearish-impulsive or resumes the impulsive bullish wave.
Curves as a function of trading time explain pretty much all historic bull-run growth.
It's as if there is some kind of gravity that governs the trend - or it's the PriceTime itself that curves with the emerging trend.
Individual cycles, too, can be curved accordingly.
So the more the price fails to break out of that function, the more predictive the curve becomes.
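A logarithmic trendline of this kind can be sketched as an ordinary least-squares line fitted to log(price) over time - a straight line in log space, an exponential curve in price space. The helper names and the synthetic 0.5%-per-day series below are illustrative assumptions, not real BTC data or the author's exact method:

```python
import math

def fit_log_trend(days, prices):
    """Least-squares fit of log(price) = a + b * day; returns (a, b)."""
    ys = [math.log(p) for p in prices]
    n = len(days)
    mx = sum(days) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(days, ys)) \
        / sum((x - mx) ** 2 for x in days)
    a = my - b * mx
    return a, b

def trend_price(a, b, day):
    """Price implied by the fitted exponential trend on a given day."""
    return math.exp(a + b * day)

# Synthetic series growing 0.5% per day; the fit recovers that rate exactly.
days = list(range(10))
prices = [100 * math.exp(0.005 * d) for d in days]
a, b = fit_log_trend(days, prices)
print(round(b, 4))  # 0.005 daily log-growth
```

With a fitted (a, b), a close below `trend_price(a, b, today)` would be the "cross below the curve" condition described above.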
Metrics
Q&As: non-market data
There are some curious personalities out there who trade (or at least claim to trade) based on news, fundamental metrics, alt data and stuff. I don't mean invest, I mean trade. Well, that looks like a skill to be proud of - superstimuli always feel cool, aye? The good thing, though, is that there's no real reason to do any of it.
The most precise way to define non-market data is: everything that has no direct involvement with what happens inside the order-matching servers of a given exchange.
So open interest is in fact a great example of non-market data.
The one and only real purpose of using all this data is to know (not to guess/predict/forecast, not even to anticipate) but to understand when the ACTION is going to happen. If you think deeper, ultimately it's all about asset selection to satisfy whatever purpose you've got. If you've ever caught yourself feeling fooled when the media releases bad news but prices go up, or releases good news but prices go down, it's OK. It doesn't work that way; the direction of prices can't be affected like this. The direction of prices is the result of how buyers meet sellers, which is based on an effectively infinite number of factors, of which any piece of non-market data is just one. What it exclusively provokes is action, meat, hype, momentum, volatility, whatever you call it: things start to happen very fast. Without a trigger event, the trading activity would have been the same; it just would have taken longer to unwind. News doesn't change the structure, it makes everything happen faster, that's it.
Examples of non-market data that can be used to anticipate action:
1) Trading schedule, eg the US, EU opening times;
2) Economic releases;
3) Commitment of traders reports;
4) Significant news;
5) Changes in yield curves;
6) "Fundamental" stock data;
7) Open interest;
8) etc etc etc
One really important thing to add: just as trading activity is understood in context (other resolutions), and sizing also includes context (equity control, market impact), every non-market data event likewise lives in context (previous releases, other releases, the overall economy). You're interested not in a piece of news per se, but in what it means for the world. For example, inflation reports don't mean much when rates are low, but when rates are high, they trigger significant activity.
That's the area where statistical learning, automated learning, "machine" learning really starts to make sense business-wise. The ultimate goal is to create a system that processes every kind of data you have (NLP and TDA should help) and outputs the tickers with rising - or already risen - levels of interest.