Bitcoin Leverage Sentiment - Strategy [presentTrading]
█ Introduction and How it is Different
The "Bitcoin Leverage Sentiment - Strategy " represents a novel approach in the realm of cryptocurrency trading by focusing on sentiment analysis through leveraged positions in Bitcoin. Unlike traditional strategies that primarily rely on price action or technical indicators, this strategy leverages the power of Z-Score analysis to gauge market sentiment by examining the ratio of leveraged long to short positions. By assessing how far the current sentiment deviates from the historical norm, it provides a unique lens to spot potential reversals or continuation in market trends, making it an innovative tool for traders who wish to incorporate market psychology into their trading arsenal.
BTC 4h L/S Performance
█ Strategy, How It Works: Detailed Explanation
🔶 Data Collection and Ratio Calculation
Firstly, the strategy acquires data on leveraged long (**`priceLongs`**) and short positions (**`priceShorts`**) for Bitcoin. The primary metric of interest is the ratio of long positions relative to the total of both long and short positions:
BTC Ratio = priceLongs / (priceLongs + priceShorts)
This ratio reflects the prevailing market sentiment, where values closer to 1 indicate a bullish sentiment (dominance of long positions), and values closer to 0 suggest bearish sentiment (prevalence of short positions).
🔶 Z-Score Calculation
The Z-Score is then calculated to standardize the BTC Ratio, allowing for comparison across different time periods. The Z-Score formula is:
Z = (X - μ) / σ
Where:
- X is the current BTC Ratio.
- μ is the mean of the BTC Ratio over a specified period (**`zScoreCalculationPeriod`**).
- σ is the standard deviation of the BTC Ratio over the same period.
The Z-Score helps quantify how far the current sentiment deviates from the historical norm, with high positive values indicating extreme bullish sentiment and high negative values signaling extreme bearish sentiment.
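As an illustration, a minimal Pine Script v5 sketch of the ratio and Z-Score calculation could look as follows. The data symbols (Bitfinex margin longs/shorts) and the default period are assumptions for illustration; the original script's exact sources are not stated here.
//@version=5
indicator("BTC Leverage Sentiment - Z-Score sketch")
// Hypothetical long/short feeds; replace with the feeds you actually use.
priceLongs  = request.security("BITFINEX:BTCUSDLONGS",  timeframe.period, close)
priceShorts = request.security("BITFINEX:BTCUSDSHORTS", timeframe.period, close)
btcRatio = priceLongs / (priceLongs + priceShorts)
zScoreCalculationPeriod = input.int(100, "Z-Score Calculation Period")
// Z = (X - mean) / standard deviation
zScore = (btcRatio - ta.sma(btcRatio, zScoreCalculationPeriod)) / ta.stdev(btcRatio, zScoreCalculationPeriod)
plot(zScore, "BTC Ratio Z-Score")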
🔶 Signal Generation: Trading signals are derived from the Z-Score as follows:
Long Entry Signal: Occurs when the BTC Ratio Z-Score crosses above the thresholdLongEntry, suggesting bullish sentiment.
- Condition for Long Entry = BTC Ratio Z-Score > thresholdLongEntry
Long Exit/Short Entry Signal: Triggered when the BTC Ratio Z-Score drops below thresholdLongExit for exiting longs or below thresholdShortEntry for entering shorts, indicating a shift to bearish sentiment.
- Condition for Long Exit/Short Entry = BTC Ratio Z-Score < thresholdLongExit or BTC Ratio Z-Score < thresholdShortEntry
Short Exit Signal: Happens when the BTC Ratio Z-Score exceeds the thresholdShortExit, hinting at reducing bearish sentiment and a potential switch to bullish conditions.
- Condition for Short Exit = BTC Ratio Z-Score > thresholdShortExit
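Continuing the sketch above, the threshold logic can be expressed directly from these conditions (the default threshold values below are placeholders, not the strategy's actual defaults):
thresholdLongEntry  = input.float( 1.0, "Long Entry Threshold")
thresholdLongExit   = input.float( 0.0, "Long Exit Threshold")
thresholdShortEntry = input.float(-1.0, "Short Entry Threshold")
thresholdShortExit  = input.float( 0.0, "Short Exit Threshold")
longEntry  = zScore > thresholdLongEntry
longExit   = zScore < thresholdLongExit or zScore < thresholdShortEntry
shortEntry = zScore < thresholdShortEntry
shortExit  = zScore > thresholdShortExit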
🔶Implementation and Visualization: The strategy applies these conditions for trade management, aligning with the selected trade direction. It visualizes the BTC Ratio Z-Score with horizontal lines at entry and exit thresholds, illustrating the current sentiment against historical norms.
█ Trade Direction
The strategy offers flexibility in trade direction, allowing users to choose between long, short, or both, depending on their market outlook and risk tolerance. This adaptability ensures that traders can align the strategy with their individual trading style and market conditions.
█ Usage
To employ this strategy effectively:
1. Customization: Begin by setting the trade direction and adjusting the Z-Score calculation period and entry/exit thresholds to match your trading preferences.
2. Observation: Monitor the Z-Score and its moving average for potential trading signals. Look for crossover events relative to the predefined thresholds to identify entry and exit points.
3. Confirmation: Consider using additional analysis or indicators for signal confirmation, ensuring a comprehensive approach to decision-making.
█ Default Settings
- Trade Direction: Determines if the strategy engages in long, short, or both types of trades, impacting its adaptability to market conditions.
- Timeframe Input: Influences signal frequency and sensitivity, affecting the strategy's responsiveness to market dynamics.
- Z-Score Calculation Period: Affects the strategy’s sensitivity to market changes, with longer periods smoothing data and shorter periods increasing responsiveness.
- Entry and Exit Thresholds: Set the Z-Score levels for initiating or exiting trades, balancing between capturing opportunities and minimizing false signals.
- Impact of Default Settings: Provides a balanced approach to leverage sentiment trading, with adjustments needed to optimize performance across various market conditions.
Statistics
Moving Average Properties
This indicator calculates and visualizes the Relative Smoothness (RS) and Relative Lag (RL), which you can think of as the accuracy, of a selected moving average (MA) in comparison to the SMA of length 2 (the lowest possible length for a moving average and also the one closest to the price).
Median RS (Relative Smoothness):
Interpretation: The median RS represents the median value of the Relative Smoothness calculated for the selected moving average across a specified look-back period (max bar lookback is set at 3000).
Significance: A more negative median RS (larger in magnitude) suggests that the chosen moving average has exhibited smoother price behavior compared to a simple moving average over the analyzed period. A less negative value indicates relatively choppier price movement.
Median RL (Relative Lag):
Interpretation: The median RL represents the median value of the Relative Lag calculated for the selected moving average compared to a simple moving average of length 2.
Significance: A higher median RL indicates that the chosen moving average tends to lag more compared to a simple moving average. Conversely, lower values suggest less lag in the selected moving average.
Ratio of Median RS to Median RL:
Interpretation: This ratio is calculated by dividing the median RS by the median RL.
Significance: Traders might use this ratio to assess the balance between smoothness and lag in the chosen moving average. It is a measure of how much smoothness is achieved for every percent of lag. It can also be used as a benchmark to decide what length to choose for an MA to get an equivalent value between two stocks. For example, TESLA stock on a 15-minute timeframe with a length of 12 has a value (ratio of RS/RL) of -150, whereas APPLE stock with a length of 35 on a 15-minute chart also has a value (ratio of RS/RL) of -150.
This implies that an MA of length 12 working on TESLA stock is equivalent to an MA of length 35 on APPLE stock. (THIS IS AN EXAMPLE.)
My assumption is that finding the right moving average length for a stock isn't a one-size-fits-all situation. It's not just about using a fixed length; it's about adapting to the unique characteristics of each stock. I believe that what works for one stock might not work for another because they have different levels of smoothness or lag in their price movements. So, instead of applying the same length to all stocks, I suggest adjusting the length of the moving average to match the values that we know work best for achieving the desired smoothness or lag or its ratio (RS/RL). This way, we're customizing the indicator for each stock, tailoring it to their individual behaviors rather than sticking to a one-size-fits-all approach.
Users can choose from various types of moving averages (EMA, SMA, WMA, VWMA, HMA) and customize the length of the moving average. RS measures the smoothness of the MA, while RL measures its lag compared to a simple moving average. The script plots the median RS and RL values, the selected MA, and the ratio of median RS to median RL on the price chart. Traders can use this information to assess the performance of different moving averages and potentially inform their trading decisions.
ATR Grid Levels [By MUQWISHI]
▋ INTRODUCTION :
The "ATR Levels" indicator produces a sequence of horizontal levels above and below the Center Line (reference level). They are sized based on the instrument's volatility, representing the average historical price movement on a selected higher timeframe using the Average True Range (ATR) indicator.
_______________________
▋ OVERVIEW:
_______________________
▋ IMPLEMENTATION:
The indicator starts by drawing a Center Line that is selected by the user from a variety of common levels. Then, it draws a sequence of horizontal lines above and below the Center Line, which are sized based on the most confirmed average true range (ATR) at the selected higher timeframe.
In the top right corner of the chart, there is a table displaying both the selected ATR (in the right cell) and the ATR of the current bar (in the left cell). This feature enables users to compare these two values. It's important to note that the ATR of the current bar may not be confirmed yet, as the market is still active.
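A minimal sketch of the level construction is shown below, assuming the previous higher-timeframe close as the Center Line and plain plots instead of drawn lines; the actual indicator offers a variety of Center Line options and more flexible drawing, so treat this only as an illustration of the idea.
//@version=5
indicator("ATR Grid Levels sketch", overlay=true)
htf      = input.timeframe("D", "ATR Timeframe")
atrLen   = input.int(14, "ATR Period")
gridSize = input.float(1.0, "Grid Size (ATR units)")
// use the last *closed* HTF bar so the ATR is confirmed and does not repaint
atrHTF = request.security(syminfo.tickerid, htf, ta.atr(atrLen)[1], lookahead=barmerge.lookahead_on)
center = request.security(syminfo.tickerid, htf, close[1],          lookahead=barmerge.lookahead_on)
plot(center,                         "Center Line")
plot(center + 1 * gridSize * atrHTF, "Level +1")
plot(center - 1 * gridSize * atrHTF, "Level -1")
plot(center + 2 * gridSize * atrHTF, "Level +2")
plot(center - 2 * gridSize * atrHTF, "Level -2")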
_______________________
▋ INDICATOR SETTINGS:
# Section (1): ATR Settings
(1) ATR Period & Smoothing.
(2) Timeframe where ATR value imported from.
(3) To show/hide the table comparison between the current ATR and the ATR for the selected period. Also, ability to color the current ATR cell if it’s greater.
# Section (2): Levels Settings
(1) Selecting a Center Line level among a variety of common levels, which is used as the reference level above and below which the sequence of horizontal lines is plotted.
(2) Size of grid in ATR unit.
(3) Number of horizontal lines to plot in a single side.
(4) Grid Side. Ability to plot above or below the Center Line.
(5) Lines colors, and mode.
(6) Line style.
(7) Label style.
(8) Ability to remove old lines, from previous HTF.
_____________________
▋ COMMENT:
The ATR Levels should not be taken as a major concept to build a trading decision.
Please let me know if you have any questions.
Thank you.
Octopus Nest Strategy
Hello Fellas,
Here I present a popular strategy from YouTube called the Octopus Nest Strategy. It is a non-repainting, lower-timeframe scalping strategy utilizing PSAR, EMA and TTM Squeeze.
The strategy considers these market factors:
PSAR -> Trend
EMA -> Trend
TTM Squeeze -> Momentum and Volatility by incorporating Bollinger Bands and Keltner Channels
Note: As you can see there is a potential improvement by incorporating volume.
What's Different Compared To The Original Strategy?
I added an option that allows users to use the Adaptive PSAR by @loxx, which may improve results in some cases.
Signals
Enter Long -> source above EMA 100, source crosses above PSAR and TTM Squeeze crosses above 0
Enter Short -> source below EMA 100, source crosses below PSAR and TTM Squeeze crosses below 0
Exit Long and Exit Short are triggered from the risk management. Thus, it will just exit on SL or TP.
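A rough sketch of these entry conditions in Pine Script v5 is shown below. The PSAR parameters and squeeze length are assumptions, and the TTM Squeeze momentum uses a common public formulation rather than the original code.
//@version=5
strategy("Octopus Nest sketch", overlay=true)
ema100 = ta.ema(close, 100)
psar   = ta.sar(0.02, 0.02, 0.2)
sqzLen = input.int(20, "Squeeze Length")
// TTM Squeeze momentum: linear regression of price minus the average of the Donchian midline and the SMA
mid    = math.avg(math.avg(ta.highest(high, sqzLen), ta.lowest(low, sqzLen)), ta.sma(close, sqzLen))
sqzMom = ta.linreg(close - mid, sqzLen, 0)
enterLong  = close > ema100 and ta.crossover(close, psar)  and ta.crossover(sqzMom, 0)
enterShort = close < ema100 and ta.crossunder(close, psar) and ta.crossunder(sqzMom, 0)
if enterLong
    strategy.entry("Long", strategy.long)
if enterShort
    strategy.entry("Short", strategy.short)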
Risk Management
"High Low Stop Loss" and "Automatic High Low Take Profit" are used here.
High Low Stop Loss: Utilizes the last high for shorts and the last low for longs to calculate the stop-loss level. The last high or low gets multiplied by the user-defined multiplier; if no recent high or low is found, the backup multiplier is used.
Automatic High Low Take Profit: Uses the current stop-loss level of "High Low Stop Loss" and calculates the take-profit level from the user-defined risk ratio.
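Continuing the sketch from the Signals section, the stop loss and take profit could be expressed roughly as follows. The lookback, the multipliers, and the use of ta.lowest/ta.highest for the "last low/high" are assumptions made purely for illustration.
lookback  = input.int(10,    "High/Low Lookback")
slMult    = input.float(1.0, "Stop Multiplier")
riskRatio = input.float(2.0, "Risk Ratio")
longStop  = ta.lowest(low, lookback)   * slMult               // last low times the multiplier
shortStop = ta.highest(high, lookback) * slMult               // last high times the multiplier
longTake  = close + (close - longStop)  * riskRatio           // TP derived from the current stop distance
shortTake = close - (shortStop - close) * riskRatio
strategy.exit("Long exit",  from_entry = "Long",  stop = longStop,  limit = longTake)
strategy.exit("Short exit", from_entry = "Short", stop = shortStop, limit = shortTake)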
Now follows some background knowledge for less experienced readers.
PSAR: Parabolic Stop And Reverse; developed by J. Welles Wilder Jr., it is a classic trend reversal indicator.
The indicator works most effectively in trending markets where large price moves allow traders to capture significant gains. When a security’s price is range-bound, the indicator will constantly be reversing, resulting in multiple low-profit or losing trades.
TTM Squeeze: TTM Squeeze is a volatility and momentum indicator introduced by John Carter of Trade the Markets (now Simpler Trading), which capitalizes on the tendency for price to break out strongly after consolidating in a tight trading range.
The volatility component of the TTM Squeeze indicator measures price compression using Bollinger Bands and Keltner Channels. If the Bollinger Bands are completely enclosed within the Keltner Channels, that indicates a period of very low volatility. This state is known as the squeeze. When the Bollinger Bands expand and move back outside of the Keltner Channel, the squeeze is said to have “fired”: volatility increases and prices are likely to break out of that tight trading range in one direction or the other. The on/off state of the squeeze is shown with small dots on the zero line of the indicator: red dots indicate the squeeze is on, and green dots indicate the squeeze is off.
EMA: Exponential Moving Average; Like a simple moving average, but with exponential weighting of the input data.
Don't forget to check out the settings and keep it up.
Best regards,
simwai
---
Credits to:
@loxx
@Bjorgum
@Greeny
Machine Learning Cross-Validation Split & Batch Highlighter
This indicator is designed for traders and analysts who employ Machine Learning (ML) techniques for cross-validation in financial markets.
The script visually segments a selected range of historical price data into splits and batches, helping in the assessment of model performance over different market conditions.
Theory
In ML, cross-validation is a technique to assess the generalizability of a model, typically by partitioning the data into a set of "folds" or "splits." Each split acts as a validation set, while the others form the training set. This script takes a unique approach by considering the sequential nature of financial time series data, where random shuffling of data (as in traditional cross-validation) can disrupt the temporal order, leading to misleading results.
Chronological Integrity of Splits
Even if the order of the splits is shuffled for cross-validation purposes, the data within each split remains in its original chronological sequence. This feature is crucial for time series analysis, as it respects the inherent order-dependency of financial markets. Thus, each split can be considered a microcosm of market behavior, maintaining the integrity of trends, cycles, and patterns that could be disrupted by random sampling.
The script allows users to define the number of splits and the size of each batch within a split. By doing so, it maintains the chronological sequence of the data, ensuring that the validation set is representative of a future time period that the model would predict.
Parameters
Number of Splits: Defines how many segments the selected data range will be divided into. Each split serves as a standalone testing ground for the ML model. (Up to 24)
Batch Size: Determines the number of bars (candles) in each batch within a split. Smaller batches can help pinpoint overfitting at a finer granularity.
Start Index: The bar index from where the historical data range begins. It sets the starting point for data analysis.
End Index: The bar index where the historical data range ends. It marks the cutoff for data to be included in the model assessment.
Usage
To use this script effectively:
1 - Input the Start Index and End Index to define the historical data range you wish to analyze.
2 - Adjust the Number of Splits to create multiple validation sets for cross-validation.
3 - Set the Batch Size to control the granularity of each validation set within the splits.
4 - The script will highlight the background of each batch within the splits using alternating shades, allowing for a clear visual distinction of the data segmentation.
By maintaining the temporal sequence and allowing for adjustable granularity, the "ML Split and Batch Highlighter" aids in creating a robust validation framework for time series forecasting models in finance.
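A minimal sketch of the batch highlighting described above could look like this; the parameter defaults and colors are arbitrary choices for illustration, not the published script's values.
//@version=5
indicator("ML Split & Batch Highlighter sketch", overlay=true)
startIndex = input.int(0,    "Start Index")
endIndex   = input.int(5000, "End Index")
batchSize  = input.int(20,   "Batch Size")
inRange  = bar_index >= startIndex and bar_index <= endIndex
batchNum = math.floor((bar_index - startIndex) / batchSize)
// alternate the background shade for each consecutive batch inside the selected range
bgcolor(inRange ? (batchNum % 2 == 0 ? color.new(color.teal, 85) : color.new(color.orange, 85)) : na)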
Bandwidth Volatility - Silverman Rule of thumb Estimator
Overview
This indicator calculates volatility using the Rule of Thumb bandwidth estimator, incorporating the standard deviation of returns to obtain historical volatility. There are two options: one for the original rule of thumb bandwidth estimator, and another for the modified rule of thumb estimator. The indicator displays the bandwidth as color gradient columns, colored by the percentile of the bandwidth, together with the moving average of the bandwidth, shown as the dark shaded area.
The rule of thumb bandwidth estimator is a simple and quick method for estimating the bandwidth parameter in kernel density estimation (KDE) or kernel regression. It provides a rough approximation of the bandwidth without requiring extensive computational resources or fine-tuning. One common rule of thumb estimator is Silverman's rule, which is given by
h = 1.06 * σ * n^(-1/5)
where
h is the bandwidth
σ is the standard deviation of the data
n is the number of data points
This rule of thumb is based on assuming a Gaussian kernel and aims to strike a balance between over-smoothing and under-smoothing the data. It is simple to implement and usually provides reasonable bandwidth estimates for a wide range of datasets. However, it is important to note that this rule of thumb may not always produce optimal results, especially for non-Gaussian or multimodal distributions. In such cases, a modified bandwidth selection, such as cross-validation or even applying a log transformation (if the data is right-skewed), may be preferable.
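In Pine Script v5, the Silverman bandwidth on returns could be sketched as follows (a minimal illustration, not the indicator's exact code):
//@version=5
indicator("Silverman bandwidth sketch")
n     = input.int(100, "Sample Length")
ret   = math.log(close / close[1])              // log returns
sigma = ta.stdev(ret, n)
h     = 1.06 * sigma * math.pow(n, -0.2)        // h = 1.06 * sigma * n^(-1/5)
plot(h, "Bandwidth")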
How it works:
This indicator computes the bandwidth volatility using returns, which are used in the standard deviation calculation. It then estimates the bandwidth based on either the Silverman rule of thumb or a modified version considering the interquartile range. The percentile ranks of the bandwidth estimate are then used to visualize the volatility levels, identify high and low volatility periods, and show them with colors.
Modified Rule of thumb Bandwidth:
The modified rule of thumb bandwidth formula combines elements of standard deviations and interquartile ranges, scaled by a multiplier of 0.9 and inversely with a number of periods. This modification aims to provide a more robust and adaptable bandwidth estimation method, particularly suitable for financial time series data with potentially skewed or heavy-tailed data.
Formula for Modified Rule of Thumb Bandwidth:
h = 0.9 * min(σ, IQR/1.34) * n^(-1/5)
This modification introduces the use of the IQR divided by 1.34 as an alternative to the standard deviation. It aims to improve the estimation, mainly when the underlying distribution deviates from a perfect Gaussian distribution.
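Extending the sketch above, the modified estimator only swaps the dispersion term; ta.percentile_linear_interpolation is used here to obtain the interquartile range.
q75  = ta.percentile_linear_interpolation(ret, n, 75)
q25  = ta.percentile_linear_interpolation(ret, n, 25)
iqr  = q75 - q25
hMod = 0.9 * math.min(sigma, iqr / 1.34) * math.pow(n, -0.2)
plot(hMod, "Modified bandwidth")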
Analysis
Rule of thumb Bandwidth: Provides a broader perspective on volatility trends, smoothing out short-term fluctuations and focusing more on the overall shape of the density function.
Historical Volatility: Offers a more granular view of volatility, capturing day-to-day or intra-period fluctuations in asset prices and returns.
Pros of Bandwidth as a volatility measure
Robust to Data Distribution: Bandwidth volatility, especially when estimated using robust methods like Silverman's rule of thumb or its modifications, can be less sensitive to outliers and non-normal distributions compared to some other measures of volatility.
Flexibility: It can be applied to a wide range of data types and can adapt to different underlying data distributions, making it versatile for various analytical tasks.
How can traders use this indicator?
In finance, volatility is thought to be a mean-reverting process. So when volatility is at an extreme low, it is expected that a volatility expansion happens, which comes with bigger movements in price, and when volatility is at an extreme high, it is expected for volatility to eventually decrease, leading to smaller price moves, and many traders view this as an area to take profit in.
In the context of this indicator, low volatility is thought of as having the green color, which indicates a low percentile value, and also being below the moving average. High volatility is thought of as having the yellow color and possibly being above the moving average, showing that you can eventually expect volatility to decrease.
Likelihood of Winning - Probability Density Function
In developing the "Likelihood of Winning - Probability Density Function (PDF)" indicator, my aim was to offer traders a statistical tool to quantify the probability of reaching target prices. This indicator, grounded in risk assessment principles, enables users to analyze potential outcomes based on the normal distribution, providing insights into market dynamics.
The tool's flexibility allows for customization of the data series, lookback periods, and target settings for both long and short scenarios. It features a color-coded visualization to easily distinguish between probabilities of hitting specified targets, enhancing decision-making in trading strategies.
I'm excited to share this indicator with the trading community, hoping it will enhance data-driven decision-making and offer a deeper understanding of market risks and opportunities. My goal is to continuously improve this tool based on user feedback and market evolution, contributing to more informed trading practices.
This indicator leverages the "NormalDistributionFunctions" library, enabling easy integration into other indicators or strategies. Users can readily embed advanced statistical analysis into their trading tools, fostering innovation within the Pine Script community.
NormalDistributionFunctions
Library "NormalDistributionFunctions"
The NormalDistributionFunctions library encompasses a comprehensive suite of statistical tools for financial market analysis. It provides functions to calculate essential statistical measures such as mean, standard deviation, skewness, and kurtosis, alongside advanced functionalities for computing the probability density function (PDF), cumulative distribution function (CDF), Z-score, and confidence intervals. This library is designed to assist in the assessment of market volatility, distribution characteristics of asset returns, and risk management calculations, making it an invaluable resource for traders and financial analysts.
meanAndStdDev(source, length)
Calculates and returns the mean and standard deviation for a given data series over a specified period.
Parameters:
source (float) : float: The data series to analyze.
length (int) : int: The lookback period for the calculation.
Returns: Returns an array where the first element is the mean and the second element is the standard deviation of the data series for the given period.
skewness(source, mean, stdDev, length)
Calculates and returns skewness for a given data series over a specified period.
Parameters:
source (float) : float: The data series to analyze.
mean (float) : float: The mean of the distribution.
stdDev (float) : float: The standard deviation of the distribution.
length (int) : int: The lookback period for the calculation.
Returns: Returns skewness value
kurtosis(source, mean, stdDev, length)
Calculates and returns kurtosis for a given data series over a specified period.
Parameters:
source (float) : float: The data series to analyze.
mean (float) : float: The mean of the distribution.
stdDev (float) : float: The standard deviation of the distribution.
length (int) : int: The lookback period for the calculation.
Returns: Returns kurtosis value
pdf(x, mean, stdDev)
pdf: Calculates the probability density function for a given value within a normal distribution.
Parameters:
x (float) : float: The value to evaluate the PDF at.
mean (float) : float: The mean of the distribution.
stdDev (float) : float: The standard deviation of the distribution.
Returns: Returns the probability density function value for x.
cdf(x, mean, stdDev)
cdf: Calculates the cumulative distribution function for a given value within a normal distribution.
Parameters:
x (float) : float: The value to evaluate the CDF at.
mean (float) : float: The mean of the distribution.
stdDev (float) : float: The standard deviation of the distribution.
Returns: Returns the cumulative distribution function value for x.
confidenceInterval(mean, stdDev, size, confidenceLevel)
Calculates the confidence interval for a data series mean.
Parameters:
mean (float) : float: The mean of the data series.
stdDev (float) : float: The standard deviation of the data series.
size (int) : int: The sample size.
confidenceLevel (float) : float: The confidence level (e.g., 0.95 for 95% confidence).
Returns: Returns the lower and upper bounds of the confidence interval.
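For reference, the pdf and cdf described above follow the standard normal-distribution formulas. The self-contained sketch below uses the Abramowitz-Stegun approximation of the error function for the CDF, which may differ from the library's internal implementation; it is shown only to make the math concrete.
//@version=5
indicator("Normal pdf/cdf sketch")
normPdf(float x, float mean, float stdDev) =>
    1.0 / (stdDev * math.sqrt(2.0 * math.pi)) * math.exp(-math.pow(x - mean, 2) / (2.0 * stdDev * stdDev))
normCdf(float x, float mean, float stdDev) =>
    z   = (x - mean) / (stdDev * math.sqrt(2.0))
    t   = 1.0 / (1.0 + 0.3275911 * math.abs(z))
    erf = 1.0 - ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t - 0.284496736) * t + 0.254829592) * t * math.exp(-z * z)
    0.5 * (1.0 + (z >= 0 ? erf : -erf))
// example usage: probability that a normal variable with the rolling mean/stdev is below the current close
plot(normCdf(close, ta.sma(close, 100), ta.stdev(close, 100)), "CDF of current close")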
ApproximateGaussianSmoothing
Library "ApproximateGaussianSmoothing"
This library provides a novel smoothing function for time-series data, serving as an alternative to SMA and EMA. Additionally, it provides some statistical processing, using moving averages as expected values in statistics.
'Approximate Gaussian Smoothing' (AGS) is designed to apply weights to time-series data that closely resemble Gaussian smoothing weights. It is easier to calculate than the similar ALMA.
In case AGS is used as a moving average, I named it 'Approximate Gaussian Weighted Moving Average' (AGWMA).
The formula is:
AGWMA = (EMA + EMA(EMA) + EMA(EMA(EMA)) + EMA(EMA(EMA(EMA)))) / 4
The EMA parameter alpha is 5 / (N + 4), using time period N (or length).
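As an illustration (not part of the library's documented functions), a direct transcription of this formula could look like the sketch below. A custom EMA is used so that alpha = 5 / (N + 4) is honoured, since ta.ema uses a different alpha.
//@version=5
indicator("AGWMA sketch", overlay=true)
emaCustom(float src, float alpha) =>
    var float e = na
    e := na(e) ? src : alpha * src + (1 - alpha) * e
    e
agwma(float src, int length) =>
    alpha = 5.0 / (length + 4)
    e1 = emaCustom(src, alpha)          // EMA
    e2 = emaCustom(e1, alpha)           // EMA(EMA)
    e3 = emaCustom(e2, alpha)           // EMA(EMA(EMA))
    e4 = emaCustom(e3, alpha)           // EMA(EMA(EMA(EMA)))
    (e1 + e2 + e3 + e4) / 4.0
plot(agwma(close, 20), "AGWMA(20)")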
ma(src, length)
Calculate moving average using AGS (AGWMA).
Parameters:
src (float) : Series of values to process.
length (simple int) : Number of bars (length).
Returns: Moving average.
analyse(src, length)
Calculate mean and variance using AGS.
Parameters:
src (float) : Series of values to process.
length (simple int) : Number of bars (length).
Returns: Mean and variance.
analyse(dimensions, sources, length)
Calculate mean and variance covariance matrix using AGS.
Parameters:
dimensions (simple int) : Dimensions of sources to process.
sources (array) : Series of values to process.
length (simple int) : Number of bars (length).
Returns: Mean and variance covariance matrix.
trend(src, length)
Calculate intercept (LSMA) and slope using AGS.
Parameters:
src (float) : Series of values to process.
length (simple int) : Number of bars (length).
Returns: Intercept and slope.
Bandwidth Bands - Silverman's rule of thumb
What are Bandwidth Bands?
This indicator uses the Silverman Rule of Thumb Bandwidth to estimate the width of bands around a rolling moving average. It applies a log transformation to price to remove most of the price skewness for the rest of the volatility calculations, and then an exp() function is applied to convert the result back to a right-skewed distribution. These bandwidth bands can offer insights into price volatility and trading extremes.
Silverman rule of thumb bandwidth:
The Silverman Rule of Thumb Bandwidth is a heuristic method used to estimate the optimal bandwidth for kernel density estimation, a statistical technique for estimating the probability density function of a random variable. In the context of financial analysis, such as in this indicator, it helps determine the width of bands around a moving average, providing insights into the level of volatility in the market. This method is particularly useful because it offers a quick and straightforward way to estimate bandwidth without requiring extensive computational resources or complex mathematical calculation
The bandwidth estimator automatically adjusts to the characteristics of the data, providing a flexible and dynamic measure of dispersion that can capture variations in volatility over time. Standard deviations alone may not be as adaptive to changes in data distributions. The bandwidth considers the overall shape and structure of the data distribution rather than just focusing on the spread of data points.
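A simplified sketch of how such bands might be constructed is shown below, assuming a simple moving average of log price as the basis and one bandwidth unit per band; the actual indicator's basis and band multiples may differ.
//@version=5
indicator("Bandwidth Bands sketch", overlay=true)
len   = input.int(100, "Sample Length")
logP  = math.log(close)                                      // log transform to reduce skewness
basis = ta.sma(logP, len)
h     = 1.06 * ta.stdev(logP, len) * math.pow(len, -0.2)     // Silverman bandwidth on log price
plot(math.exp(basis),     "Basis")                           // exp() converts back to price space
plot(math.exp(basis + h), "Upper band")
plot(math.exp(basis - h), "Lower band")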
Settings
Source
Sample length
1-4 SD options to disable or enable each band
FVG Detector Library
Library "FVG Detector Library"
🔵 Introduction
To save time and improve accuracy in your scripts for identifying Fair Value Gaps (FVGs), you can utilize this library. Apart from detecting and plotting FVGs, one of the most significant advantages of this script is the ability to filter FVGs, which you'll learn more about below. Additionally, the plotting of each FVG continues until either a new FVG occurs or the current FVG is mitigated.
🔵 Definition
Fair Value Gap (FVG) refers to a situation where three consecutive candlesticks do not overlap. Based on this definition, the minimum conditions for detecting a fair value gap in the ascending scenario are that the minimum price of the last candlestick should be greater than the maximum price of the third candlestick, and in the descending scenario, the maximum price of the last candlestick should be smaller than the minimum price of the third candlestick.
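Assuming "the third candlestick" refers to the first of the three (two bars back), these minimum, unfiltered conditions can be sketched as follows; this is only an illustration of the definition, not the library's code.
//@version=5
indicator("FVG minimal conditions sketch", overlay=true)
bullFVG = low  > high[2]   // ascending FVG: current low above the high from two bars earlier
bearFVG = high < low[2]    // descending FVG: current high below the low from two bars earlier
plotshape(bullFVG, "Bullish FVG", style=shape.triangleup,   location=location.belowbar, color=color.green)
plotshape(bearFVG, "Bearish FVG", style=shape.triangledown, location=location.abovebar, color=color.red)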
If the filter is turned off, all FVGs that meet at least the minimum conditions are identified. This mode is simplistic and results in a high number of identified FVGs.
If the filter is turned on, you have four options to filter FVGs :
1. Very Aggressive : In addition to the initial condition, another condition is added. For ascending FVGs, the maximum price of the last candlestick should be greater than the maximum price of the middle candlestick. Similarly, for descending FVGs, the minimum price of the last candlestick should be smaller than the minimum price of the middle candlestick. In this mode, a very small number of FVGs are eliminated.
2. Aggressive : In addition to the conditions of the Very Aggressive mode, in this mode, the size of the middle candlestick should not be small. This mode eliminates more FVGs compared to the Very Aggressive mode.
3. Defensive : In addition to the conditions of the Very Aggressive mode, in this mode, the size of the middle candlestick should be relatively large, and most of it should consist of the body. Also, for identifying ascending FVGs, the second and third candlesticks must be positive, and for identifying descending FVGs, the second and third candlesticks must be negative. In this mode, a significant number of FVGs are eliminated, and the remaining FVGs have a decent quality.
4. Very Defensive : In addition to the conditions of the Defensive mode, the first and third candlesticks should not resemble very small-bodied doji candlesticks. In this mode, the majority of FVGs are filtered out, and the remaining ones are of higher quality.
By default, we recommend using the Defensive mode.
🔵 How to Use
🟣 Parameters
To utilize this library, you need to provide four input parameters to the function.
"FVGFilter" determines whether you wish to apply a filter on FVGs or not. The possible inputs for this parameter are "On" and "Off", provided as strings.
"FVGFilterType" determines the type of filter to be applied to the found FVGs. These filters include four modes: "Very Defensive", "Defensive", "Aggressive", and "Very Aggressive", respectively exhibiting decreasing sensitivity and indicating a higher number of Fair Value Gaps (FVG).
The parameter "ShowDeFVG" is a Boolean value defined as either "true" or "false". If this value is "true", FVGs are shown during the Bullish Trend; however, if it is "false", they are not displayed.
The parameter "ShowSuFVG" is a Boolean value defined as either "true" or "false". If this value is "true", FVGs are displayed during the Bearish Trend; however, if it is "false", they are not displayed.
FVGDetector(FVGFilter, FVGFilterType, ShowDeFVG, ShowSuFVG)
Parameters:
FVGFilter (string)
FVGFilterType (string)
ShowDeFVG (bool)
ShowSuFVG (bool)
🟣 Import Library
You can use the "FVG Detector" library in your script using the following expression:
import TFlab/FVGDetectorLibrary/1 as FVG
🟣 Input Parameters
The descriptions related to the input parameters were provided in the "Parameter" section. In this section, for your convenience, the code related to the inputs is also included, and you can copy and paste it into your script.
PFVGFilter = input.string('On', 'FVG Filter', options = ['On', 'Off'])
PFVGFilterType = input.string('Defensive', 'FVG Filter Type', options = ['Very Defensive', 'Defensive', 'Aggressive', 'Very Aggressive'])
PShowDeFVG = input.bool(true, ' Show Demand FVG')
PShowSuFVG = input.bool(true, ' Show Supply FVG')
🟣 Call Function
You can copy the following code into your script to call the FVG function. This code is based on the naming conventions provided in the "Input Parameter" section, so if you want to use exactly this code, you should have similar parameter names or have copied the "Input Parameter" values.
FVG.FVGDetector(PFVGFilter, PFVGFilterType, PShowDeFVG, PShowSuFVG)
Dynamic Momentum Gauge
Overview
The Dynamic Momentum Gauge is an indicator designed to provide information and insights into the trend and momentum of a financial asset. While this indicator is not directional, it helps you know when a trend, a big move, or a momentum run is likely to occur, and when you should take profits.
How It Works
This indicator calculates momentum and then removes the negative values, focusing instead on when a big trend is likely to start and when it could end, or when you should enter or exit a trade based on momentum. Traders can use this indicator to time their market entries and exits and align their strategies with momentum dynamics.
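A rough, assumption-heavy sketch of this idea is shown below, taking "removes the negative values" to mean clamping momentum at zero and normalizing it with a rolling min-max; the actual calculation may differ.
//@version=5
indicator("Dynamic Momentum Gauge sketch")
len     = input.int(100, "Length")
normLen = input.int(100, "Normalization Length")
mom   = math.max(ta.mom(close, len), 0)          // momentum with negative values removed
lo    = ta.lowest(mom, normLen)
hi    = ta.highest(mom, normLen)
gauge = hi == lo ? 0.0 : (mom - lo) / (hi - lo)  // min-max normalization to the 0..1 range
plot(gauge, "Gauge")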
How To Use
As previously mentioned, this is not a directional indicator but more of a timing indicator. It helps you find when trend moves and big moves in the markets will occur, and when it is possibly best to exit trades. For example, suppose you decide to enter a long trade when the Dynamic Momentum Gauge value is at an extreme low and another momentum indicator you use shows conditions you would consider going long with. In that case, this indicator is telling you that there isn't much more room for momentum to squeeze any lower; it can only really expand from that point or stay where it currently is. Since this is also a mean-reverting process, it does tend to go back up from the low point.
Settings:
Length: This is the length of the momentum, by default its at 100.
Normalization Length: Length of the normalization, which ensures the values fall within a consistent range.
Optimal Buy Day (Zeiierman)
█ Overview
The Optimal Buy Day (Zeiierman) indicator identifies optimal buying days based on historical price data, starting from a user-defined year. It simulates investing a fixed initial capital and making regular monthly contributions. The unique aspect of this indicator involves comparing systematic investment on specific days of the month against a randomized buying day each month, aiming to analyze which method might yield more shares or a better average price over time. By visualizing the potential outcomes of systematic versus randomized buying, traders can better understand the impact of market timing and how regular investments might accumulate over time.
These statistics are pivotal for traders and investors using the script to analyze historical performance and strategize future investments. By understanding which days offered more shares for their money or lower average prices, investors can tailor their buying strategies to potentially enhance returns.
█ Key Statistics
⚪ Shares
Definition: Represents the total number of shares acquired on a particular day of the month across the entire simulation period.
How It Works: The script calculates how many shares can be bought each day, given the available capital or monthly contribution. This calculation takes into account the day's opening price and accumulates the total shares bought on that day over the simulation period.
Interpretation: A higher number of shares indicates that the day consistently offered better buying opportunities, allowing the investor to acquire more shares for the same amount of money. This metric is crucial for understanding which days historically provided more value.
⚪ AVG Price
Definition: The average price paid per share on a particular day of the month, averaged over the simulation period.
How It Works: Each time shares are bought, the script calculates the average price per share, factoring in the new shares purchased at the current price. This average evolves over time as more shares are bought at varying prices.
Interpretation: The average price gives insight into the cost efficiency of buying shares on specific days. A lower average price suggests that buying on that day has historically led to better pricing, making it a potentially more attractive investment strategy.
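As a simplified sketch of how these two metrics could be accumulated per day of the month, consider the snippet below; the real script also handles the initial capital, market-closed days, and the randomized comparison, all of which are omitted here.
//@version=5
indicator("Optimal Buy Day sketch")
contribution = input.float(100.0, "Monthly Contribution")
var float[] sharesByDay = array.new_float(32, 0.0)
var float[] costByDay   = array.new_float(32, 0.0)
d = dayofmonth(time)
sharesToday = contribution / open                              // shares the contribution buys at today's open
array.set(sharesByDay, d, array.get(sharesByDay, d) + sharesToday)
array.set(costByDay,   d, array.get(costByDay,   d) + contribution)
avgPrice(int day) =>
    array.get(sharesByDay, day) > 0 ? array.get(costByDay, day) / array.get(sharesByDay, day) : na
plot(avgPrice(d), "Average price for today's day-of-month")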
⚪ Buys
Definition: The total number of transactions or buys executed on a particular day of the month throughout the simulation.
How It Works: This metric increments each time shares are bought on a specific day, providing a count of all buying actions taken.
Interpretation: The number of buys indicates the frequency of investment opportunities. A higher count could mean more consistent opportunities for investment, but it's important to consider this in conjunction with the average price and the total shares acquired to assess overall strategy effectiveness.
⚪ Most Shares
Definition: Identifies the day of the month on which the highest number of shares were bought, highlighting the specific day and the total shares acquired.
How It Works: After simulating purchases across all days of the month, the script identifies which day resulted in the highest total number of shares bought.
Interpretation: This metric points out the most opportune day for volume buying. It suggests that historically, this day provided conditions that allowed for maximizing the quantity of shares purchased, potentially due to lower prices or other factors.
⚪ Best Price
Definition: Highlights the day of the month that offered the lowest average price per share, indicating both the day and the price.
How It Works: The script calculates the average price per share for each day and identifies the day with the lowest average.
Interpretation: This metric is key for investors looking to minimize costs. The best price day suggests that historically, buying on this day led to acquiring shares at a more favorable average price, potentially maximizing long-term investment returns.
⚪ Randomized Shares
Definition: This metric represents the total number of shares acquired on a randomly selected day of the month, simulated across the entire period.
How It Works: At the beginning of each month within the simulation, the script selects a random day when the market is open and calculates how many shares can be purchased with the available capital or monthly contribution at that day's opening price. This process is repeated each month, and the total number of shares acquired through these random purchases is tallied.
Interpretation: Randomized shares offer a comparison point to systematic buying strategies. By comparing the total shares acquired through random selection against those bought on the best or worst days, investors can gauge the impact of timing and market fluctuations on their investment strategy. A higher total in randomized shares might indicate that over the long term, the specific days chosen for investment might matter less than consistent market participation. Conversely, if systematic strategies yield significantly more shares, it suggests that timing could indeed play a crucial role in maximizing investment returns.
⚪ Randomized Price
Definition: The average price paid per share for the shares acquired on the randomly selected days throughout the simulation period.
How It Works: Each time shares are bought on a randomly chosen day, the script calculates the average price paid for all shares bought through this randomized strategy. This average price is updated as the simulation progresses, reflecting the cost efficiency of random buying decisions.
Interpretation: The randomized price metric helps investors understand the cost implications of a non-systematic, random investment approach. Comparing this average price to those achieved through more deliberate, systematic strategies can reveal whether consistent investment timing strategies outperform random investment actions in terms of cost efficiency. A lower randomized price suggests that random buying might not necessarily result in higher costs, while a higher average price indicates that systematic strategies might provide better control over investment costs.
█ How to Use
Traders can use this tool to analyze historical data and simulate different investment strategies. By inputting their initial capital, regular contribution amount, and start year, they can visually assess which days might have been more advantageous for buying, based on historical price actions. This can inform future investment decisions, especially for those employing dollar-cost averaging strategies or looking to optimize entry points.
█ Settings
StartYear: This setting allows the user to specify the starting year for the investment simulation. Changing this value will either extend or shorten the period over which the simulation is run. If a user increases the value, the simulation begins later and covers a shorter historical period; decreasing the value starts the simulation earlier, encompassing a longer time frame.
Capital: Determines the initial amount of capital with which the simulation begins. Increasing this value simulates starting with more capital, which can affect the number of shares that can be initially bought. Decreasing this value simulates starting with less capital.
Contribution: Sets the monthly financial contribution added to the investment within the simulation. A higher contribution increases the investment each month and could lead to more shares being purchased over time. Lowering the contribution decreases the monthly investment amount.
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
Backtest any Indicator v5
Happy Trade,
Here you get the opportunity to backtest any of your indicators like a strategy without converting them into a strategy. You can choose to go long or go short, with detailed time filters. Furthermore, you can set the take profit and stop loss, initial capital, quantity per trade, and the exchange fees. You get an overall result table and even a detailed, scrollable table with all trades. In Image 1 you see the provided info tables about all trades and the result summary. Furthermore, every trade is marked by a background color, labels, and levels: an opening label with the trade direction and trade number; a closing label, again with the trade number, the trade's profit in % and the total amount of $ after all past trades; a green line for the take profit level; and a red line for the stop loss.
Image 1
Example
For this description we choose the Stochastic RSI indicator from TradingView as it is. Image 2 shows its performance with decent settings.
Timeframe=45, BTCUSD, 2023-08-01 - 2023-10-20
Stoch RSI: k=30, d=40, RSI-length=140, stoch-length=140
Backtest any Indicator: input signal=Stoch RSI, goLong, take profit=9.1%, stop loss=2.5%, start capital=1000$, qty=5%, fee=0.1%, no Session Filter
Image 2
Usage
1) You need to know the name of the boolean (or integer) variable of your indicator which holds the buy condition. Let's say that this boolean variable is called BUY. If this BUY variable is not plotted on the chart, simply add the following code line at the end of your Pine script.
For boolean (true/false) BUY variables use this:
plot(BUY ? 1:0,'Your buy condition hold in that variable BUY',display = display.data_window)
And in case your script's BUY variable is an integer or float, then use instead the following code line:
plot(BUY ,'Your buy condition hold in that variable BUY',display = display.data_window)
2) The name of this BUY variable in your indicator is probably not BUY. Simply replace BUY in the code line above with the name of your script's trade condition variable.
3) Save your changed Indicator script.
4) Then add this 'Backtest any Indicator' script to the chart ...
5) and go to the settings of it. Choose under "Settings -> Buy Signal" your Indicator. So in the example above choose .
The form is usually: ' : BUY'. Then you see something like Image 2
6) Decide which trade direction the BUY signal should trigger, a go Long or a go Short, by setting the check mark or not.
Now you have a backtest of your indicator without converting it into a strategy. You may change your indicator's settings for the best results and set up the following strategy settings like Time and Session Filter, Stop Loss, Take Profit, etc. More on this below in the section Settings Menu.
Appearance
In Image 2 you see the List of Trades on the right side. To scroll down, go into the settings again and decrease the Scroll value. This way you can see all trades that happened before. In case there is an open trade, you will find it at the last position of the list.
Every Long trade is green back grounded while Short trades are red.
Every trade begins with a label that show goLong or goShort and its number. And ends with another label again with its number, Profit in % and the resulting total amount of cash.
If activated, you also see the Take Profit as a green line and the Stop Loss as an orange line. In the settings you can set their percentage above or below the entry price.
You also see the Result Summary below. Here you find the usual stats of a strategy for all closed trades: the profit after fees, the number of trades, the Profit Factor, and the total amount of fees.
Settings Menu
In the settings menu you will find the following highlighted sections. Most of the settings have a question mark on their right side; hover over it with the cursor to read a specific explanation.
Input Signal of your Indicator: Under Buy you set the trade signal of your indicator, and under Target you set the value at which a trade should happen. In the example with the Stochastic RSI above we used 20. Below that you can set the trade direction: go short when checked or go long when unchecked.
Trade Settings & List of Trades: Take Profit sets the target price of any trade. Stop Loss sets the price at which to step out when a trade goes in the wrong direction. Check the List of Trades box to see every single trade with its stats. In case there are more trades than fit in the list, you can scroll down the list by decreasing the Scroll value.
Time Filter: You can set a Start Time or deactivate it by leaving it unchecked. The same applies to the End Time.
Session Filter: Here you can choose to activate it on a weekly basis, selecting which days of the week should be traded and which not, and also on a daily basis, from which time until which time trades are possible. If activated, no new trades are opened outside these times and sessions.
Invest Settings: Here you choose the amount of cash to start with. The Quantity percentage defines how much of the cash is invested in every trade, and the Fee percentage has to be paid on every trade, both when opening and when closing a position.
Other Announcements
This backtest script doesn't use TradingView's strategy functions; it is programmed as an indicator. All trades are executed at candle close. The script uses TradingView's "Indicator-on-Indicator" functionality.
Conclusion
So now it is your turn: take your promising indicators and connect them to this backtest script. With it you get a fast impression of how successfully your indicator would trade. You don't have to rely on coders who might add cheating code lines. Furthermore, with the Time Filter you can check under which market conditions your indicator performs best or not so well. With the Session Filter you can also sort out recurring good market conditions for your indicator. You can even check, with the GoShort XOR GoLong check mark, your indicator's trade signals in the opposite trade direction with one click, and compare your indicators under the same conditions, getting the results after just two clicks. Thanks to the built-in fee setting, you get an impression of how much a 0.1% fee costs you in total.
Cheers
Day/Week/Month Metrics (Zeiierman)
█ Overview
The Day/Week/Month Metrics (Zeiierman) indicator is a powerful tool for traders looking to incorporate historical performance into their trading strategy. It computes statistical metrics related to the performance of a trading instrument on different time scales: daily, weekly, and monthly. Breaking down the performance into daily, weekly, and monthly metrics provides a granular view of the instrument's behavior.
The indicator requires the chart to be set on a daily timeframe.
█ Key Statistics
⚪ Day in month
The performance of financial markets can show variability across different days within a month. This phenomenon, often referred to as the "monthly effect" or "turn-of-the-month effect," suggests that certain days of the month, especially the first and last days, tend to exhibit higher than average returns in many stock markets around the world. This effect is attributed to various factors including payroll contributions, investment of monthly dividends, and psychological factors among traders and investors.
⚪ Edge
The Edge calculation identifies days within a month that consistently outperform the average monthly trading performance. It provides a statistical advantage by quantifying how often trading on these specific days yields better returns than the overall monthly average. This insight helps traders understand not just when returns might be higher, but also how reliable these patterns are over time. By focusing on days with a higher "Edge," traders can potentially increase their chances of success by aligning their strategies with historically more profitable days.
⚪ Month
Historically, the stock market has exhibited seasonal trends, with certain months showing distinct patterns of performance. One of the most well-documented patterns is the "Sell in May and go away" phenomenon, suggesting that the period from November to April has historically brought significantly stronger gains in many major stock indices compared to the period from May to October. This pattern highlights the potential impact of seasonal investor sentiment and activities on market performance.
⚪ Day in week
Various studies have identified the "day-of-the-week effect," where certain days of the week, particularly Monday and Friday, show different average returns compared to other weekdays. Historically, Mondays have been associated with lower or negative average returns in many markets, a phenomenon often linked to the settlement of trades from the previous week and negative news accumulation over the weekend. Fridays, on the other hand, might exhibit positive bias as investors adjust positions ahead of the weekend.
⚪ Week in month
The performance of markets can also vary within different weeks of the month, with some studies suggesting a "week of the month effect." Typically, the first and the last week of the month may show stronger performance compared to the middle weeks. This pattern can be influenced by factors such as the timing of economic reports, monthly investment flows, and options and futures expiration dates which tend to cluster around these periods, affecting investor behavior and market liquidity.
█ How It Works
⚪ Day in Month
For each day of the month (1-31), the script calculates the average percentage change between the opening and closing prices of a trading instrument. This metric helps identify which days have historically been more volatile or profitable.
It uses arrays to store the sum of percentage changes for each day and the total occurrences of each day to calculate the average percentage change.
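In Pine terms, the accumulation described above could look roughly like the sketch below; the published indicator's arrays and table rendering are more involved, so this is only an illustration of the mechanism.
//@version=5
indicator("Day-in-month average change sketch")
var float[] sumChg = array.new_float(32, 0.0)
var int[]   cnt    = array.new_int(32, 0)
d   = dayofmonth(time)
chg = (close - open) / open * 100                  // percentage change between open and close
array.set(sumChg, d, array.get(sumChg, d) + chg)
array.set(cnt,    d, array.get(cnt,    d) + 1)
avgChg = array.get(cnt, d) > 0 ? array.get(sumChg, d) / array.get(cnt, d) : na
plot(avgChg, "Average % change for today's day-of-month")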
⚪ Month
The script calculates the overall gain for each month (January-December) by comparing the closing price at the start of a month to the closing price at the end, expressed as a percentage. This metric offers insights into which months might offer better trading opportunities based on historical performance.
Monthly gains are tracked using arrays that store the sum of these gains for each month and the count of occurrences to calculate the average monthly gain.
⚪ Day in Week
Similar to the day in the month analysis, the script evaluates the average percentage change between the opening and closing prices for each day of the week (Monday-Sunday). This information can be used to assess which days of the week are typically more favorable for trading.
The script uses arrays to accumulate percentage changes and occurrences for each weekday, allowing for the calculation of average changes per day of the week.
⚪ Week in Month
The script assesses the performance of each week within a month, identifying the gain from the start to the end of each week, expressed as a percentage. This can help traders understand which weeks within a month may have historically presented better trading conditions.
It employs arrays to track the weekly gains and the number of weeks, using a counter to identify which week of the month it is (1-4), allowing for the calculation of average weekly gains.
█ How to Use
Traders can use this indicator to identify patterns or trends in the instrument's performance. For example, if a particular day of the week consistently shows a higher percentage of bullish closes, a trader might consider this in their strategy. Similarly, if certain months show stronger performance historically, this information could influence trading decisions.
Identifying High-Performance Days and Periods
Day in Month & Day in Week Analysis: By examining the average percentage change for each day of the month and week, traders can identify specific days that historically have shown higher volatility or profitability. This allows for targeted trading strategies, focusing on these high-performance days to maximize potential gains.
Month Analysis: Understanding which months have historically provided better returns enables traders to adjust their trading intensity or capital allocation in anticipation of seasonally stronger or weaker periods.
Week in Month Analysis: Identifying which weeks within a month have historically been more profitable can help traders plan their trades around these periods, potentially increasing their chances of success.
█ Settings
Enable or disable the types of statistics you want to display in the table.
Table Size: Users can select the size of the table displayed on the chart, ranging from "Tiny" to "Auto," which adjusts based on screen size.
Table Position: Users can choose the location of the table on the chart
-----------------
Disclaimer
The information contained in my Scripts/Indicators/Ideas/Algos/Systems does not constitute financial advice or a solicitation to buy or sell any securities of any type. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
My Scripts/Indicators/Ideas/Algos/Systems are only for educational purposes!
Stablecoin Dominance [LuxAlgo]
The Stablecoin Dominance tool displays the evolution of the relative supply dominance of major stablecoins such as USDT, USDC, BUSD, DAI, and TUSD.
Users can disable supported stablecoins to only show the supply dominance relative to the ones enabled.
🔶 USAGE
The stablecoin space is subject to constant change due to newly arriving stablecoins, regulation, the collapse of coins, etc.
Studying the evolution in supply dominance can help see the effect that certain events can have on the stablecoin sphere.
This dominance graph is displayed over the user price chart to easily observe the correlation between stablecoin dominances and market prices. Users can still move the tool to a new pane below if having it on the price chart is not desired.
🔶 DETAILS
Supported stablecoins include:
Tether (USDT)
USD Coin (USDC)
Binance USD (BUSD)
Dai (DAI)
TrueUSD (TUSD)
Supply dominance of a stablecoin is calculated by dividing the total supply of that stablecoin by the total supply of all enabled stablecoins. That is for N stablecoins:
sd(stablecoin i) = supply(stablecoin i) / [supply(stablecoin 1) + supply(stablecoin 2) + supply(stablecoin 3) + ... + supply(stablecoin N)]
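As an illustration only, the calculation could be sketched as follows in Pine Script v5. The CRYPTOCAP:* market-cap symbols are an assumption used here as a proxy for the circulating supply of $1-pegged coins, and are not necessarily the data source the published tool uses.

//@version=5
indicator("Stablecoin Dominance (sketch)", overlay = false)

// market caps of $1-pegged stablecoins used as a proxy for supply (assumed symbols)
usdt = request.security("CRYPTOCAP:USDT", timeframe.period, close)
usdc = request.security("CRYPTOCAP:USDC", timeframe.period, close)
dai  = request.security("CRYPTOCAP:DAI",  timeframe.period, close)

total = usdt + usdc + dai

// supply dominance of each enabled stablecoin, in percent
plot(usdt / total * 100, "USDT dominance", color.green)
plot(usdc / total * 100, "USDC dominance", color.blue)
plot(dai  / total * 100, "DAI dominance",  color.orange)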
🔹 Display
Users can control the fill style of the displayed areas, with "Gradient" enabled by default. Using "Solid" will apply a solid color to each area, which can improve the performance of the script.
Selecting "None" will not display the areas.
🔶 SETTINGS
Fill Style: Fill style of the areas between each returned supply dominance. "Gradient" will color the areas using a gradient, while "Solid" will use a solid color.
Stablecoins List: List of stablecoins used for the supply dominance calculation; disabling a stablecoin will exclude it from all calculations.
Blockunity Address Synthesis (BAS)
Track the address status of the various cryptoassets and their evolution.
The Idea
The goal is to provide a simple tool for visualizing the evolution of different types of crypto addresses.
How to Use
This tool is to be used as fundamental information. It is not intended for investment or trading purposes.
Elements
Active Addresses
Active Addresses represent the subset of total addresses that made one or more on-chain transactions on a given day.
New Addresses
New Addresses refer to addresses that receive their first deposit in the selected crypto-asset.
Zero Balance Addresses
Zero Balance Addresses are addresses that transferred out (potentially sold) all of their holdings for the selected crypto-asset.
Total Addresses
Total Addresses refer to the overall count of unique addresses that have been created on a blockchain network.
Settings
In the settings, you can:
Adjust line smoothing (in terms of number of days).
Change the lookback period used to calculate the different variations.
Show or hide the different address types (for better visualization, Total Addresses should be displayed alone).
Show or hide labels and configure their offset.
Lastly, you can modify all table parameters.
Open Interest Inflows & Outflows [LuxAlgo]
The Open Interest Inflows & Outflows indicator highlights changes in the total number of active contracts associated with a specific financial instrument.
The indicator also includes an oscillator highlighting price sentiment, to be used in conjunction with the open interest flow sentiment, as well as a rolling correlation of the open interest flow sentiment with a user-selected source.
🔶 USAGE
Open Interest (OI) indicates the total number of active contracts, encompassing both long and short positions, for a specific financial instrument at any given moment. This key indicator helps traders and analysts assess market activity and sentiment.
An increase in open interest generally indicates new money flowing into the market, suggesting increased activity and the potential for a trending market. Conversely, a decrease in open interest indicates that traders are closing their positions, suggesting less interest in that particular contract.
Open Interest Flow Sentiment assesses the balance between the initiation of new positions (inflows) and the closure of existing positions (outflows) for a particular instrument. Positive values suggest a prevalence of inflows, while negative values signify a prevalence of outflows.
The magnitude of the deviation from zero reflects the extent of dominance, either in inflows or outflows.
Price Sentiment estimates the relationship between the strength of bulls (buyers) and bears (sellers) on an instrument. Positive values indicate higher bull power and negative values indicate higher bear power.
The correlation feature is a key component of the indicator and helps analyze the relationship between trading volume and Open Interest changes. If volume increases along with rising Open Interest, it supports the validity of the price trend.
A divergence between price movement, volume, and Open Interest may signal potential reversals.
🔶 DETAILS
This indicator, based on Dr. Alexander Elder's acclaimed Elder-Ray concept, aids traders in evaluating the strength of both bulls and bears by delving beneath the surface of the markets. It uncovers data not immediately apparent from a superficial glance at prices. The indicator comprises two components: Bull Power and Bear Power.
Considering that the high price of any candle signifies the maximum power of buyers and the low price represents the maximum power of sellers, Elder employs the 13-period Exponential Moving Average (EMA) to depict the average consensus of price value. Bull Power assesses whether buyers can drive prices above the average consensus of value, while Bear Power assesses whether sellers can push prices below this average.
Here are the formulas for Bull Power and Bear Power:
bull_power = high - ema(close, 13)
bear_power = low - ema(close, 13)
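For reference, these two formulas translate directly into Pine Script v5 as shown below. This is only a minimal rendering of the Elder-Ray components, not the indicator's full sentiment calculation, which applies the same idea to open interest data as well.

//@version=5
indicator("Elder-Ray Bull/Bear Power (sketch)", overlay = false)

emaLength = input.int(13, "EMA Length")
avgPrice  = ta.ema(close, emaLength)   // Elder's "average consensus of value"

bullPower = high - avgPrice            // can buyers push price above the consensus?
bearPower = low  - avgPrice            // can sellers push price below the consensus?

plot(bullPower, "Bull Power", color.green, style = plot.style_columns)
plot(bearPower, "Bear Power", color.red,   style = plot.style_columns)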
This concept is utilized to calculate Open Interest Flow Sentiment and Price Sentiment. The Open Interest Flow Sentiment estimates the relationship between new positions (inflows) and positions being closed (outflows), providing insights into market dynamics. The Price Sentiment, on the other hand, gauges the correlation between price movements and the Elder-Ray components, aiding traders in identifying potential shifts in market sentiment and momentum.
🔶 SETTINGS
🔹Open Interest Inflows & Outflows
OI Sentiment Correlation: toggles the visibility of Open Interest correlation with a variety of sources.
Money Flow Estimates: toggles the visibility of Money Flow Estimates calculated for the last bar.
🔹Style
OI Flow Sentiment: toggles the visibility of Open Interest Flow Sentiment, along with color customization options.
Price Sentiment: toggles the visibility of Price Sentiment, along with color customization options.
Correlation Colors: color customization option for the Correlation Area.
🔹Others
Smoothing: smoothing length applicable for Open Interest Flow Sentiment and Price Sentiment.
🔶 RELATED SCRIPTS
Open-Interest-Chart
Liquidation-Estimates
Thanks to our community for recommending this script. For more conceptual scripts and related content, we welcome you to explore by visiting >>> LuxAlgo-Scripts.
Exceptional Movement
This indicator is a simple tracker for exceptional movement.
It compares the range of the latest candle with the average daily range of the past 20 candles.
The multiplier option defines how large the current movement must be, relative to that average, to qualify as exceptional.
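A minimal sketch of this logic in Pine Script v5 might look as follows, assuming the comparison uses the high-low range of the current candle against a 20-candle simple average of past ranges; the published script's exact range definition may differ.

//@version=5
indicator("Exceptional Movement (sketch)", overlay = true)

lookback   = input.int(20, "Average Range Lookback")
multiplier = input.float(2.0, "Multiplier", step = 0.1)

candleRange  = high - low
avgPastRange = ta.sma(candleRange, lookback)[1]   // average range of the previous candles

exceptional = candleRange > avgPastRange * multiplier
plotshape(exceptional, "Exceptional movement", shape.triangleup, location.belowbar, color.yellow)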
Volatility Adjusted Profit Target
In my 'Volatility Adjusted Profit Target' indicator, I've crafted a dynamic tool for calculating target profit percentages suitable for both long and short trading strategies. It evaluates the highest and lowest prices over the anticipated duration of your trade, establishing a profit target that shifts with market volatility. As volatility increases, the potential for profit follows, with the target percentage rising accordingly; conversely, it declines with decreasing volatility. As a trader, setting an optimal Take Profit level has always been a challenge. This indicator not only helps in determining that level but also dynamically adjusts it throughout the trade's duration, providing a strategic edge in volatile markets.
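One plausible reading of this description is sketched below in Pine Script v5: the target percentage is derived from the realized high-low range over the anticipated trade duration, so it expands and contracts with volatility. The actual formula used by the indicator may differ.

//@version=5
indicator("Volatility Adjusted Profit Target (sketch)", overlay = false)

tradeDuration = input.int(20, "Anticipated Trade Duration (bars)")

// realized range over the anticipated holding period, expressed as a % of the current price
highestPrice = ta.highest(high, tradeDuration)
lowestPrice  = ta.lowest(low, tradeDuration)
targetPct    = (highestPrice - lowestPrice) / close * 100

plot(targetPct, "Target profit %", color.purple)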
Within Standard Deviation Bounds Probability
This indicator calculates the probability of the closing price remaining within the upper and lower bounds defined by the mean and standard deviation of historical percent changes. It also plots the probability line and a horizontal line at 68%, which would be the expected probability for a normal distribution. It is designed to be used with my other indicator "Mean and Standard Deviation Lines".
Inputs:
period (Days): This defines the number of bars used to calculate the mean and standard deviation.
Calculations:
Percent change: Calculates the daily percentage change between closing prices.
Mean and standard deviation: Calculates the mean and standard deviation of the percent changes over the specified period.
Bounds: Calculates the upper and lower bounds by adding/subtracting the standard deviation from the mean, multiplied by the closing price.
Crossover tracking: Iterates through bars and counts crosses above and below the bounds.
Probability calculation: Calculates the total crossover probability as a percentage of the period.
Plotting: Plots the probability line and the horizontal line at 68%.
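The steps above can be approximated with the Pine Script v5 sketch below. It assumes the bounds are projected from the previous bar's close and that the probability is simply the share of bars in the period whose close stayed inside those projected bounds; the published indicator's exact bookkeeping may differ.

//@version=5
indicator("Within Std-Dev Bounds Probability (sketch)", overlay = false)

period = input.int(252, "Period (Days)")

pctChange = (close - close[1]) / close[1]              // daily percent change
meanPC    = ta.sma(pctChange, period)
stdevPC   = ta.stdev(pctChange, period)

upperBound = close * (1 + meanPC + stdevPC)            // bounds one stdev around the mean move
lowerBound = close * (1 + meanPC - stdevPC)

// count how often the close escaped the bounds projected from the previous bar
crossUp   = ta.crossover(close, upperBound[1])
crossDown = ta.crossunder(close, lowerBound[1])
crosses   = math.sum((crossUp or crossDown) ? 1 : 0, period)

withinProb = (1 - crosses / period) * 100              // % of bars that stayed inside the bounds

plot(withinProb, "Within-bounds probability (%)", color.blue)
hline(68, "Normal-distribution expectation (68%)")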
Limitations:
Assumes a normal distribution of price changes, which may not be accurate in real markets.
Overall:
This indicator provides a way to visualize the probability of the price staying within calculated bounds based on historical volatility. However, it's important to be aware of its limitations and interpret the results within the context of your trading strategy and risk management.
Interest Rate Indicator
This script offers an overview of Year-over-Year (YoY) interest rates for key countries. The interest rate data used by default are sourced from TradingView tickers, but they can be changed to any preferred source via the settings.
The script does not perform any calculations; its primary function is to present a comparative view of interest rates across different countries in a single indicator.
Key features include:
Interest rate data for the USA, European Union, Australia, Canada, Switzerland, Japan, United Kingdom, and New Zealand (Interest Rate Symbols are editable in the settings).
A table displaying country flags, names, and the latest interest rates, providing a clear and immediate comparison.
Country-representative colors for easy identification and visual distinction between different countries' data.
This indicator is designed for traders and analysts looking for a quick and efficient way to monitor and compare the interest rates of major economies directly within TradingView, facilitating better informed financial and investment decisions.
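For illustration, a stripped-down version of the idea could be written as follows in Pine Script v5. The ECONOMICS:* symbols, the colors, and the table layout are assumptions for this sketch (the real script's symbols are editable in its settings), and flags are omitted here.

//@version=5
indicator("Interest Rate Comparison (sketch)", overlay = false)

// assumed tickers; replace with the symbols you prefer
usRate = request.security("ECONOMICS:USINTR", "D", close)
euRate = request.security("ECONOMICS:EUINTR", "D", close)
jpRate = request.security("ECONOMICS:JPINTR", "D", close)

plot(usRate, "USA interest rate",   color.navy)
plot(euRate, "EU interest rate",    color.blue)
plot(jpRate, "Japan interest rate", color.maroon)

// simple comparison table with the latest values
var tbl = table.new(position.top_right, 2, 3, border_width = 1)
if barstate.islast
    table.cell(tbl, 0, 0, "USA",   bgcolor = color.navy,   text_color = color.white)
    table.cell(tbl, 1, 0, str.tostring(usRate, "#.##") + "%", bgcolor = color.navy,   text_color = color.white)
    table.cell(tbl, 0, 1, "EU",    bgcolor = color.blue,   text_color = color.white)
    table.cell(tbl, 1, 1, str.tostring(euRate, "#.##") + "%", bgcolor = color.blue,   text_color = color.white)
    table.cell(tbl, 0, 2, "Japan", bgcolor = color.maroon, text_color = color.white)
    table.cell(tbl, 1, 2, str.tostring(jpRate, "#.##") + "%", bgcolor = color.maroon, text_color = color.white)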
Mean and Standard Deviation Lines
Description:
Calculates the mean and standard deviation of close-to-close price differences over a specified period, providing insights into price volatility and potential breakouts.
Manually calculates mean and standard deviation for a deeper understanding of statistical concepts.
Plots the mean line, upper bound (mean + standard deviation), and lower bound (mean - standard deviation) to visualize price behavior relative to these levels.
Highlights bars that cross the upper or lower bounds with green (above) or red (below) triangles for easy identification of potential breakouts or breakdowns.
Customizable period input allows for analysis of short-term or long-term volatility patterns.
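A minimal sketch of the described approach in Pine Script v5 is given below, assuming the manual mean and standard deviation are computed over the close-to-close differences of the previous bars (excluding the current bar) and the lines are anchored to the prior close; the published indicator's exact anchoring may differ.

//@version=5
indicator("Mean and Standard Deviation Lines (sketch)", overlay = true, max_bars_back = 500)

period = input.int(20, "Period")

diff = close - close[1]                    // close-to-close price difference

// manual mean, excluding the current bar's value
sumDiff = 0.0
for i = 1 to period
    sumDiff += diff[i]
meanDiff = sumDiff / period

// manual unbiased (sample) standard deviation
sumSq = 0.0
for i = 1 to period
    sumSq += math.pow(diff[i] - meanDiff, 2)
stdevDiff = math.sqrt(sumSq / (period - 1))

// lines anchored to the prior close
meanLine  = close[1] + meanDiff
upperLine = close[1] + meanDiff + stdevDiff
lowerLine = close[1] + meanDiff - stdevDiff

plot(meanLine,  "Mean",  color.gray)
plot(upperLine, "Upper", color.green)
plot(lowerLine, "Lower", color.red)

// highlight closes that move outside the bounds
plotshape(close > upperLine, "Above upper", shape.triangleup,   location.abovebar, color.green)
plotshape(close < lowerLine, "Below lower", shape.triangledown, location.belowbar, color.red)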
Probability Interpretations based on Standard Deviation:
50% probability: In a normal distribution the mean (expected value) is also the median, so roughly half of the observations fall on either side of it.
68% probability: Values within 1 standard deviation of the mean (mean ± stdev) represent roughly 68% of the data in a normal distribution. This implies that around 68% of closing prices in the past period fell within this range.
95% probability: Expanding to 2 standard deviations (mean ± 2*stdev) captures approximately 95% of the data. So, in theory, there's a 95% chance that future closing prices will fall within this wider range.
99.7% probability: Going further to 3 standard deviations (mean ± 3*stdev) encompasses nearly 99.7% of the data. However, these extreme values become less likely as you move further away from the mean.
Key Features:
Uses manual calculations for mean and standard deviation, providing a hands-on approach.
Excludes the current bar's close price from calculations for more accurate analysis of past data.
Ensures valid index usage for robust calculation logic.
Employs unbiased standard deviation calculation for better statistical validity.
Offers clear visual representation of mean and volatility bands.
Considerations:
Manual calculations might have a slight performance impact compared to built-in functions.
Not a perfect normal distribution: Financial markets often deviate from a perfect normal distribution. This means probability interpretations based on standard deviation shouldn't be taken as absolute truths.
Non-stationarity: Market conditions and price behavior can change over time, impacting the validity of past data as a future predictor.
Other factors: Many other factors influence price movements beyond just the mean and standard deviation.
Always consider other technical and fundamental factors when making trading decisions.
Potential Use Cases:
Identifying periods of high or low volatility.
Discovering potential breakout or breakdown opportunities.
Comparing volatility across different timeframes.
Complementing other technical indicators for confirmation.
Understanding statistical concepts for financial analysis.