Three Bars Play Strategy [JoseMetal]
============
ENGLISH
============
- Description:
This strategy is based on two simple candlestick patterns (you can pick between 2 variants) with an extra option to require trigger candles to be opposite to the closing one (explained below).
There are several customizable settings such as take profit, stop loss and break even (all based on ATR).
You can customize the starting and ending dates for the backtests.
Other options include allowing the strategy to switch position when it signals a SHORT while you are LONG, and vice versa.
There's an additional optional EMA filter.
- LONG / SHORT ENTRY:
Original pattern: for LONG, current candle must close ABOVE the HIGH of previous candle and the candle 3 positions back, opposite conditions for SHORT.
Variant pattern: for LONG, the current candle must close ABOVE the HIGH of the previous candle and the candle before that one too, opposite conditions for SHORT.
Optional: require the trigger candles to be opposite, e.g. for a LONG you need the previous candles to be RED (bearish).
Optional: EMA filter, price must be ABOVE for LONGs, below for SHORTs.
- EXIT CONDITION:
Stop Loss or Take Profit, based on ATR.
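As a rough illustration of the entry and exit rules above (not the script's actual Pine code), here is a minimal Python sketch; the ATR multipliers and column layout are assumptions, not the script's defaults.

```python
# Minimal sketch of the "Three Bars Play" entry and ATR-based exits (illustrative only).
# Assumes OHLC lists ordered oldest -> newest.

def atr(high, low, close, length=14):
    trs = []
    for i in range(1, len(close)):
        tr = max(high[i] - low[i], abs(high[i] - close[i - 1]), abs(low[i] - close[i - 1]))
        trs.append(tr)
    return sum(trs[-length:]) / min(length, len(trs))

def long_signal(open_, high, low, close, variant=False, require_opposite=False):
    i = len(close) - 1
    if variant:
        # Variant pattern: close above the highs of the two previous candles.
        cond = close[i] > high[i - 1] and close[i] > high[i - 2]
        triggers = [i - 1, i - 2]
    else:
        # Original pattern: close above the high of the previous candle and the candle 3 bars back.
        cond = close[i] > high[i - 1] and close[i] > high[i - 3]
        triggers = [i - 1, i - 3]
    if require_opposite:
        # Trigger candles must be bearish (red) for a LONG.
        cond = cond and all(close[j] < open_[j] for j in triggers)
    return cond

def exit_levels(entry_price, atr_value, sl_mult=1.5, tp_mult=3.0, side="long"):
    # Stop loss and take profit as multiples of the ATR (example multipliers).
    if side == "long":
        return entry_price - sl_mult * atr_value, entry_price + tp_mult * atr_value
    return entry_price + sl_mult * atr_value, entry_price - tp_mult * atr_value
```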
- Visual:
The script prints the Take Profit as a GREEN line, Stop Loss as a RED line and entry price with a WHITE line.
If enabled, the Break Even required price is BLUE, and the new Stop Loss level (for break even or protecting profit) is AQUA.
- Recommendations:
This strategy works well on the DAILY timeframe on most assets, including crypto, forex and gold.
12H also seems to work in most cases; lower timeframes perform worse.
- Customization:
You can customize indicator settings (ATR, EMA...).
Stop Loss and Take Profit ATR multipliers are also customizable.
Break even is optional; the trigger level and the new stop level (also based on ATR) are customizable too.
Almost everything is customizable, for colors and plotting styles check the "Style" tab.
Enjoy!
============
SPANISH
============
- Description:
This strategy is based on two simple candlestick patterns (you can choose between 2 variants) with an extra option to require the trigger candles to be opposite to the closing one (explained further below).
There are several customizable settings such as take profit, stop loss and break even (all based on the ATR).
You can customize the start and end dates of the backtests.
Other options include allowing the position to be switched if the strategy goes SHORT while it is LONG, and vice versa.
There is an additional optional EMA filter.
- LONG / SHORT ENTRY:
Original pattern: for a LONG, the current candle must close ABOVE the HIGH of the previous candle and of the candle 3 positions back; opposite conditions for a SHORT.
Variant pattern: for a LONG, the current candle must close ABOVE the HIGH of the previous candle and the candle before that one as well; opposite conditions for a SHORT.
Optional: requires the trigger candles to be opposite, for example: for a LONG it requires the previous candles to be RED (bearish).
Optional: EMA filter, the price must be ABOVE it for LONGs, below for SHORTs.
- EXIT CONDITION:
Stop Loss or Take Profit, based on the ATR.
- Visual:
The script draws the Take Profit as a GREEN line, the Stop Loss as a RED line and the entry price with a WHITE line.
If enabled, the required break-even price is BLUE, and the new Stop Loss level (for break even or locking in profit) is AQUA.
- Recommendations:
This strategy works great on the DAILY timeframe on most assets, including crypto, forex and gold.
The 12H seems to work in most cases; lower timeframes are worse.
- Customization:
You can customize the indicator settings (ATR, EMA...).
The Stop Loss and Take Profit ATR multipliers are also customizable.
Break even is optional; the required level and the break-even levels (also based on the ATR) are customizable as well.
Almost everything is customizable; for colors and plotting styles check the "Style" tab.
Enjoy!
ATR / Volatility / Leverage [JoseMetal]
============
ENGLISH
============
- Description:
This is a utility indicator: it prints a table with the ATR for 3 custom timeframes and, using that ATR as a basis, calculates volatility (%) and a recommended leverage depending on your risk settings.
I use this tool to determine the leverage for each asset and keep the same risk management for all of them.
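The exact formulas are not published in the description, but conceptually the table values can be reproduced roughly as below; the way the risk setting maps to leverage is an assumption for illustration only.

```python
# Rough sketch: ATR -> volatility (%) -> suggested leverage (illustrative, not the script's exact math).

def atr_volatility_pct(atr_value, price):
    # Volatility expressed as the ATR as a percentage of price.
    return 100.0 * atr_value / price

def suggested_leverage(vol_pct, risk_pct=1.0, max_leverage=25.0):
    # Assumed mapping: the more volatile the asset, the lower the leverage,
    # so the risk taken per trade stays roughly constant across assets.
    lev = risk_pct / vol_pct if vol_pct > 0 else max_leverage
    return min(round(lev, 1), max_leverage)

# Example: BTC with a daily ATR of 1500 at a price of 30000 -> 5% volatility -> low leverage.
print(suggested_leverage(atr_volatility_pct(1500, 30000), risk_pct=5.0))
```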
- Visual:
It shows a table with ATR, volatility and leverage for 3 timeframes.
For each timeframe it also prints 2 ATR periods, short and long, also customizable, so you can see the range it moves in at a glance.
- Customization:
You can customize up to 3 different timeframes, ATR short and long length, as well as a multiplier.
There's a risk setting that you should tweak depending on your way to trade.
Everything else is customizable (as usual in my scripts): colors, indicator settings, etc.
- Usage and recommendations:
Default settings are my own; feel free to tweak them as you wish. I usually trade on 4H using 1-2% of my account balance per trade with low leverage, so you will probably want to increase the risk setting. That is also strongly recommended if you trade forex and metals, because I mainly trade crypto.
Enjoy!
============
SPANISH
============
- Description:
This is a utility indicator; it shows a table with the ATR for 3 customizable timeframes and, using that ATR as a basis, calculates the volatility (%) and a recommended leverage depending on your risk settings.
I use this tool to determine the leverage for each asset and keep the same risk management for all of them (it makes no sense to go 5x on BTC and 5x on GOLD, for example... this utility solves that problem).
- Visual:
It shows a table with the ATR, volatility and leverage for 3 timeframes.
For each timeframe it also shows 2 period ranges, short and long, also customizable, so that you can see at a glance the ranges it moves within.
- Customization:
You can customize up to 3 different timeframes, the short and long ATR lengths, as well as a multiplier.
There is a risk setting that you should adjust depending on the way you trade.
Everything else is customizable (as usual in my scripts): colors, indicator settings, etc.
- Usage and recommendations:
The default settings are my own; feel free to adjust them as you wish. I usually trade on 4H using 1-2% of my account balance per trade with low leverage, so you will probably want to increase the risk setting; that is also highly recommended if you trade forex and metals, because I mainly trade crypto.
Enjoy!
ILM CFTC COT Legacy Plot
Use this indicator on Daily Timeframe
Please refer to the below link for CFTC Disaggregated COT
www.cftc.gov
This script is very similar to the COT Financial Plot indicator, except that it plots the data for Futures in Legacy buckets: Commercial vs. Non-Commercial.
Volume percentrank [TV1]
Volume percentrank
Volume normalized by percentile.
The indicator calculates the percentile of the trading volume. The volume in the base asset or the quote asset can be selected as the data source. To calculate the volume in the quote asset, the closing price or another standard method for calculating the price of a bar can be used.
Percentile calculation with a small sample length has low accuracy. Although the script allows you to calculate a percentile with a length of 1, using a percentile length of less than 100 is not recommended.
The percentile cannot be calculated correctly at the beginning of the chart because the sample is not yet complete; therefore, when the date of the first bar changes (which happens on small timeframes if the TradingView subscription does not allow you to see all historical data), the indicator will repaint up to the bar number equal to the percentile sample length.
Very large percentile lengths may cause a script error. If the indicator doesn't work, just make the percentile length smaller.
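For reference, a percentile rank of volume can be sketched like this in Python (Pine's built-in ta.percentrank behaves similarly); the length used here is just an example.

```python
# Sketch: percentile rank of the latest volume within a rolling window (illustrative).

def percentrank(values, length=100):
    # Percentage of the previous `length` values that are below the current value.
    window = values[-length - 1:-1]
    if not window:
        return None
    current = values[-1]
    below = sum(1 for v in window if v < current)
    return 100.0 * below / len(window)

volumes = [120, 90, 300, 150, 80, 500]
print(percentrank(volumes, length=5))  # how unusual the latest volume is vs. the window
```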
Volume normalized by percentile.
The indicator calculates the percentile of the trading volume. The volume in the base asset or the quote asset can be selected as the data. To calculate the volume in the quote asset, the closing price or another standard method of calculating the bar price can be used.
A peculiarity of percentile calculation with a small data sample length is low accuracy. Even though the script allows you to calculate a percentile with a length of 1, using a percentile length of less than 100 is not recommended.
The percentile calculation method cannot correctly calculate the percentile at the beginning of the chart due to the absence of all the data in the sample; therefore, when the date of the first bar changes (this happens on small timeframes if the TradingView subscription does not allow you to see all historical data), the indicator is subject to repainting up to the bar number equal to the percentile sample length.
Large values of the percentile length may lead to a script error. If the indicator does not work, just make the percentile length smaller.
Hurst Exponent (Dubuc's variation method)
Library "Hurst"
hurst(length, samples, hi, lo)
Estimate the Hurst Exponent using Dubuc's variation method
Parameters:
length : The length of the history window to use. Large values do not cause lag.
samples : The number of scale samples to take within the window. These samples are then used for regression. The minimum value is 2 but 3+ is recommended. Large values give more accurate results but suffer from a performance penalty.
hi : The high value of the series to analyze.
lo : The low value of the series to analyze.
The Hurst Exponent is a measure of fractal dimension, and in the context of time series it may be interpreted as indicating a mean-reverting market if the value is below 0.5 or a trending market if the value is above 0.5. A value of exactly 0.5 corresponds to a random walk.
There are many definitions of fractal dimension and many methods for its estimation. Approaches relying on calculation of an area, such as the Box Counting Method, are inappropriate for time series data, because the units of the x-axis (time) do not match the units of the y-axis (price). Other approaches such as Detrended Fluctuation Analysis are useful for nonstationary time series but are not exactly equivalent to the Hurst Exponent.
This library implements Dubuc's variation method for estimating the Hurst Exponent. The technique is insensitive to x-axis units and is therefore useful for time series. It will give slightly different results to DFA, and the two methods should be compared to see which estimator fits your trading objectives best.
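As a rough outline of the variation method (not the library's exact code), the sketch below sums the local oscillation of the series at several scales and estimates H from the slope of log(variation) against log(scale); the set of scales and the regression details are assumptions.

```python
import math

def hurst_dubuc(highs, lows, scales=(1, 2, 4, 8, 16)):
    """Sketch of Dubuc's variation method. For each scale eps, average the
    oscillation (local max - min over a window of +/- eps bars), then take
    the OLS slope of log(variation) versus log(eps) as the Hurst estimate."""
    n = len(highs)
    xs, ys = [], []
    for eps in scales:
        var_sum = 0.0
        for t in range(n):
            lo_i, hi_i = max(0, t - eps), min(n, t + eps + 1)
            var_sum += max(highs[lo_i:hi_i]) - min(lows[lo_i:hi_i])
        xs.append(math.log(eps))
        ys.append(math.log(var_sum / n))
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope  # ~H: below 0.5 suggests mean reversion, above 0.5 suggests trending
```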
Original Paper:
Dubuc B, Quiniou JF, Roques-Carmes C, Tricot C. Evaluating the fractal dimension of profiles. Physical Review A. 1989;39(3):1500-1512. DOI: 10.1103/PhysRevA.39.1500
Review of various Hurst Exponent estimators for time-series data, including Dubuc's method:
www.intechopen.com
Lagging Session Regression Channel
Hello Traders!
Note:
This is my very first published script on TradingView. From brainstorming an idea to developing the finished product, it was imperative to me that the indicator and every one of its features be of some meaningful use. If you like the idea of statistics being able to predict future prices in the market, then this indicator may be a useful addition to your trading arsenal.
Introduction :
Lagging Session Regression Channel (LSRC) is a statistical trend-analysis indicator that "lags" the market by a user-defined session (by default, a day). By doing so, the indicator leverages the ability of simple linear regression to predict future asset price. (It can be used on any asset, in any market, on any timeframe.)
Options & inputs :
- Bar regression lookback:
The number of bars back from the last session change. If the session time is equivalent to the chart timeframe, the regression line will not lag price, i.e. it will act as a standard linear regression channel, changing on every last confirmed bar.
- Standard deviation lookback:
The number of bars from the last session change used to calculate the unbiased standard deviation. The lookback can be set greater or smaller than the regression lookback to capture more or less of the asset's volatility. (Note: this is the same as the residual standard deviation.)
- Predicted price at nth bar:
Use this if you want to know the predicted close price at any given point within the regression, and to the right-hand side (RHS) of the regression.
- Regression Line colors group :
Changes the colors of each plotted line.
- OLS Line color: only changeable when trend color is set to false / unticked.
- Visible deviations group:
Plots only the lines that you want on the chart, e.g. if "Show DEV1" and "Show DEV SUB1" are the only inputs ticked, then they will be the only lines plotted along with the simple linear regression line.
- Regression Line Dynamics group :
All inputs in this group change the regression calculations, given the bar lookback is constant / the same.
- Trend color: if set to true, when the close of the latest real-time bar is greater than the simple linear regression line from the last confirmed session, the line is colored green; otherwise, if the close is below the simple linear regression line, the line is colored red.
- Extend regression line :
This is the same chart image as seen in the publication chart image, but with "Extend regression line" set to true. This allows the trader to test the validity of the regression and how well it predicts future price; as seen on the M15 chart of BTCUSD above, the indicator was pretty good at doing this.
- Standard deviation channel source:
The source on which the standard deviation is calculated. Note: if this is set to a variable other than the close, then this will no longer be the residual standard deviation; as of now ("LSRC 1.0") the regression uses only the close for y / predicted values.
- Time elapsed until next regression calculation:
The session time until the next LSRC will be calculated and plotted.
Label LSRC stats :
- STAN DEV: the standard deviation used to calculate the deviation channels
- MIN : The lowest price across the regression
- MAX : The highest price across the regression
- n bars above dev 1 : The number of bars that closed above the first standard deviation channel across the entire regression calculation
- n bars below sub dev1 : The number of bars that closed below the first standard deviation channel.
- Regression Price : The output of "Predicted price at nth bar" input.
Hope you find this useful!
I will continue to try to improve this script and update it accordingly.
ILM CFTC COT Disaggregated Plot
Use this indicator on Daily Timeframe
Please refer to the below link for CFTC Disaggregated COT
www.cftc.gov
This script is very similar to the COT Financial Plot indicator, except that it plots the data for Disaggregated Futures.
NetLiquidityLibrary
Library "NetLiquidityLibrary"
The Net Liquidity Library provides daily values for net liquidity. Net liquidity is measured as Fed Balance Sheet - Treasury General Account - Reverse Repo. Time series for each individual component are included too.
get_net_liquidity_for_date(t)
Function takes date in timestamp form and returns the Net Liquidity value for that date. If date is not present, 0 is returned.
Parameters:
t : The timestamp of the date you are requesting the Net Liquidity value for.
Returns: The Net Liquidity value for the specified date.
get_net_liquidity()
Gets the Net Liquidity time series from Dec. 2021 to current. Dates that are not present are represented as 0.
Returns: The Net Liquidity time series.
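The measure itself is a simple subtraction of the three series; a minimal sketch of the documented behaviour (missing dates return 0) using made-up values:

```python
# Sketch of the Net Liquidity definition used by the library:
# Net Liquidity = Fed Balance Sheet - Treasury General Account - Reverse Repo.

def net_liquidity_for_date(series, date):
    # Mirrors the documented behaviour: dates not present return 0.
    if date not in series:
        return 0
    fed, tga, rrp = series[date]
    return fed - tga - rrp

series = {"2022-01-03": (8750, 400, 1500)}  # made-up values, billions USD
print(net_liquidity_for_date(series, "2022-01-03"))  # 6850
print(net_liquidity_for_date(series, "2022-01-04"))  # 0 (missing date)
```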
Beta Screener
This script allows you to screen up to 38 symbols for their beta. It also allows you to compare the list not only to SPY but also to CRYPTO10! Features include custom time frame and custom colors.
Here is a refresher on what beta is:
Beta (β) is a measure of the volatility—or systematic risk—of a security or portfolio compared to the market as a whole (usually the S&P 500). Stocks with betas higher than 1.0 can be interpreted as more volatile than the S&P 500.
Beta is used in the capital asset pricing model (CAPM), which describes the relationship between systematic risk and expected return for assets (usually stocks). CAPM is widely used as a method for pricing risky securities and for generating estimates of the expected returns of assets, considering both the risk of those assets and the cost of capital.
How Beta Works
A beta coefficient can measure the volatility of an individual stock compared to the systematic risk of the entire market. In statistical terms, beta represents the slope of the line through a regression of data points. In finance, each of these data points represents an individual stock's returns against those of the market as a whole.
Beta effectively describes the activity of a security's returns as it responds to swings in the market. A security's beta is calculated by dividing the product of the covariance of the security's returns and the market's returns by the variance of the market's returns over a specified period.
beta = cov(a, b) / var(b)
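A minimal sketch of that calculation on return series; the lookback window and return frequency the screener uses are not stated here, so they are assumptions.

```python
# Beta = covariance(asset returns, market returns) / variance(market returns).

def beta(asset_returns, market_returns):
    n = len(market_returns)
    ma = sum(asset_returns) / n
    mm = sum(market_returns) / n
    cov = sum((a - ma) * (m - mm) for a, m in zip(asset_returns, market_returns)) / (n - 1)
    var = sum((m - mm) ** 2 for m in market_returns) / (n - 1)
    return cov / var

# Example: a symbol that moves exactly twice as much as the market has a beta of 2.
print(beta([0.02, -0.04, 0.06, -0.02], [0.01, -0.02, 0.03, -0.01]))
```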
ReduceSecurityCalls
Library "ReduceSecurityCalls"
This library allows you to reduce the number of request.security calls to 1 per symbol per timeframe. The script provides an example of how to use it with request.security and a possible optimisation applied to HTF data calls.
This data can be used to calculate everything you need and more than that (for example you can calculate 4 emas with one function call on mat_out).
ParseSource(mat_outs, o)
Should be used inside a request.security call. Optimise your calls using timeframe.change when parsing HTF data! Supports up to 5 expressions (results of expressions must be float or int).
Parameters:
mat_outs : Matrix to be used as outputs, first value is newest
o : Please use parametres in the order they specified (o should be 1st, h should be 2nd etc..)
Returns: outs array. Due to weird limitations, do not try this: matrix_out = matrix.copy(ParseSource)
TMR Illiquidity Index
This index is a composition of all major market liquidity factors including:
- Volatility (Interest Rates, Bonds, and Equities)
- Federal Reserve Balance Sheet
- Dollar Demand
This indicator compiles the major concepts of liquidity that Market Radar has illustrated over time into one easy-to-understand chart. This index measures how "illiquid" conditions are: the higher the index goes, the more illiquid and volatile the market tends to behave, and vice versa the lower it goes. This index allows us to evaluate the current health of market conditions and gain a visual on how liquidity is represented in the market.
Historically, we've noticed a very strong correlation between this index and the VIX, so another point to reference is where this index is in relation to the VIX.
Use the link below to obtain access to this indicator. Thank you
*This is not an indicator that claims any realized returns or an emphasis on potential returns. Any returns achieved with this strategy are not guaranteed and should not be indicative of future results. Nor should this be used as the sole decision prior to making an investment or as investment advice*
GDP Breakdown
Provides an easy way to view the subsectors that make up a country's total GDP. Not all countries provide data for each subsector (Agriculture, Construction, Manufacturing, Mining, Public Administration, Services, Utilities); only countries that provide complete data can be selected in the settings. If I've missed any, please let me know in the comment section so they can be added. This is much easier than having to individually select each ticker for each country when looking to compare how diversified an economy is.
Candlestick Stats [tanayroy]
The script detects candlestick patterns and stats related to the pattern. We have included 44 candle patterns to select from. You can get stats for any timeframe and holding period. If a particular pattern is not available, the script will give an error.
What is available:
You can view the composite stat in the table panel.
Pattern Name: The pattern name
Pattern Type: Bullish(🟢)/ Bearish(🔴)/ Neutral(⚫)
Total Found: Number of times the pattern appeared in the chart
Success: Number of times the pattern generated a positive return
Failure: Number of times the pattern generated a negative return
Highest Return: Highest return generated by the pattern (assuming trade taken at the open of the next candle and closed at the close price of the last candle of the holding period).
Lowest Return: Lowest return generated by the pattern
Average return: Average return generated by the pattern
Total Up Breakout: Number of times the pattern made an up breakout (break above the high).
Max Up Movement: Maximum up movement recorded by the pattern(distance between pattern high and highest high candle in given holding period).
Min up movement: Minimum up movement recorded by the pattern.
Average Up Movement: Average up movement recorded
Total Down Breakout: Number of times the pattern made a down breakout (break below the low).
Max Down Movement: Maximum down movement recorded by the pattern.
Min Down Movement: Minimum down movement recorded by the pattern.
Average Down Movement: Average down movement recorded
You can find the number of bars tested, start date and end date in the panel.
You can visually inspect the candle pattern performance in the chart.
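A compressed sketch of how the return stats above can be derived once the pattern bars are known, assuming entry at the next candle's open and exit at the close of the holding period as described; the script's exact Pine implementation may differ.

```python
# Sketch: per-pattern return stats with entry at next open, exit after a holding period.

def pattern_stats(opens, closes, pattern_bars, holding=5):
    returns = []
    for i in pattern_bars:
        entry_bar, exit_bar = i + 1, i + holding
        if exit_bar >= len(closes):
            continue
        ret = 100.0 * (closes[exit_bar] - opens[entry_bar]) / opens[entry_bar]
        returns.append(ret)
    if not returns:
        return None
    wins = sum(1 for r in returns if r > 0)
    return {
        "total_found": len(returns),
        "success": wins,
        "failure": len(returns) - wins,
        "highest_return": max(returns),
        "lowest_return": min(returns),
        "average_return": sum(returns) / len(returns),
    }
```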
Available options:
Detect Trend: You can detect trends based on SMA 50, SMA 50/200 or No detection.
Stat bars: Holding period after detecting the pattern.
Panel position: Position the stat table as per your choice.
Select pattern: Select available predefined pattern.
Label color: choose color according to your choice.
Sw1tchFX - Average Daily Range
DESCRIPTION AND OVERVIEW
The Average Daily Range is a measure of volatility (typically across 5 days for the FX markets). I originally saw this being used in a trading system called ANTSSYS by Daryll Guppy and some other developers. I couldn't find it anywhere so I decided to build it from scratch.
What this does is allow you to measure volatility across various FX assets (I will add other applicable asset classes in the future, i.e. crypto, commodities, blue-chip stocks) and set realistic targets based off that volatility. Overall, this makes much more sense to me in the FX markets than support and resistance lines, because it's based off the actual movement of the asset class. Market research shows that an asset class has an 80-85% chance of reaching 75% of its Average Daily Range (ADR).
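A minimal sketch of a 5-day ADR and the 75% targets; how the indicator anchors the levels to the day is not stated, so the split around the daily open below is an assumption.

```python
# Sketch: 5-day Average Daily Range and 75% targets (illustrative).

def adr(daily_highs, daily_lows, length=5):
    ranges = [h - l for h, l in zip(daily_highs[-length:], daily_lows[-length:])]
    return sum(ranges) / len(ranges)

def adr_targets(day_open, adr_value, pct=0.75):
    half = pct * adr_value / 2.0  # assumed: pct of the range split around the daily open
    return day_open + half, day_open - half

highs = [1.2450, 1.2510, 1.2480, 1.2530, 1.2490]
lows  = [1.2370, 1.2400, 1.2395, 1.2440, 1.2410]
print(adr_targets(1.2460, adr(highs, lows)))
```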
Let's take a look at the daily ADR on the GBPNZD 15m chart. Notice how the values of the ADR act as real support and resistance based off the volatility of the asset. In this case, price did not quite reach the 75% ADR target.
Let's take a look at another example on EURNZD 15m chart. In this case, price hit the 75% target.
It's important to note that these levels do not bound the price. The probability that price exceeds its 75% ADR is fairly low, but not zero, especially during important news events. Let's look at the recent USDCHF 15m chart for example.
Additionally, you can use these values to measure longer term movements (Weekly, and Monthly)
Here is a weekly view:
And a monthly view:
HOW I USE THIS
I use this in conjunction with some other indicators I've developed. Typically, I use range bars since I only care about price, not time. Additionally, averages are smoother when only price, and not time, is taken into account.
HOW THE CANDLE OPEN AND CLOSE IS CALCULATED
This is done based off of your own specific time zone and from the daily candle. So for me on PST, the daily candle will close at 1400. Once closed, a new ADR value is automatically calculated and added to the chart. There is an option to show past ADR values if you would like to see them or conduct additional research.
Multi-Panel: Trade-Volatility-Probability [Loxx]
Multi-Panel: Trade-Volatility-Probability shows user selected and volatility-based price levels and probabilities on the chart. This is useful for both options and all styles of up/down trading methods that rely on volatility.
Trading Panel: Shows trading information to take profits and stop-loss based on multiples of volatility. Also shows equity inputs by the user to calculate optimal position size
Key things to note about the Trading Panel
-Trade side: Long or short. You change this to change the take profit and SL levels displayed in the table, to be used with up/down trading styles that rely on volatility stops
-Account size: User enters total balance available for trade
-Risk: Total % of account size you're willing to lose should the SL be hit
-Position size: Size of the position given the SL and your preferred Risk
-Take profit/Stop loss levels: Based on multipliers selected by the user in settings. These shouldn't be changed unless you really know what you're doing with volatility stops
-Entry: Source price. Can be 1 of 37 different prices. See Loxx's Expanded Source Types:
Volatility Panel: Shows information about the volatility the user selected to be used to take profit/stop-loss/range calculations. Volatility types included are:
Close-to-Close
Close-to-Close volatility is a classic and most commonly used volatility measure, sometimes referred to as historical volatility.
Volatility is an indicator of the speed of a stock price change. A stock with high volatility is one where the price changes rapidly and with a bigger amplitude. The more volatile a stock is, the riskier it is.
Close-to-close historical volatility is calculated using only the stock's closing prices. It is the simplest volatility estimator. But in many cases, it is not precise enough. Stock prices could jump considerably during a trading session, and return to the open value at the end. That means that a big amount of price information is not taken into account by close-to-close volatility.
Despite its drawbacks, Close-to-Close volatility is still useful in cases where the instrument doesn't have intraday prices. For example, mutual funds calculate their net asset values daily or weekly, and thus their prices are not suitable for more sophisticated volatility estimators.
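A minimal sketch of the close-to-close estimator (standard deviation of log returns, annualized with the usual 252 trading days):

```python
import math

def close_to_close_vol(closes, trading_days=252):
    # Standard deviation of log returns, annualized (sketch).
    rets = [math.log(closes[i] / closes[i - 1]) for i in range(1, len(closes))]
    mean = sum(rets) / len(rets)
    var = sum((r - mean) ** 2 for r in rets) / (len(rets) - 1)
    return math.sqrt(var) * math.sqrt(trading_days)
```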
Parkinson
Parkinson volatility is a volatility measure that uses the stock’s high and low price of the day.
The main difference between regular volatility and Parkinson volatility is that the latter uses high and low prices for a day, rather than only the closing price. That is useful as close to close prices could show little difference while large price movements could have happened during the day. Thus Parkinson's volatility is considered to be more precise and requires less data for calculation than the close-close volatility.
One drawback of this estimator is that it doesn't take into account price movements after market close. Hence it systematically undervalues volatility. That drawback is taken into account in the Garman-Klass's volatility estimator.
Garman-Klass
Garman Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing price. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change is a process of continuous diffusion (geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremums.
Researchers Rogers and Satchel have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
Rogers-Satchell
Rogers-Satchell is an estimator for measuring the volatility of securities with an average return not equal to zero.
Unlike Parkinson and Garman-Klass estimators, Rogers-Satchell incorporates drift term (mean return not equal to zero). As a result, it provides a better volatility estimation when the underlying is trending.
The main disadvantage of this method is that it does not take into account price movements between trading sessions. It means an underestimation of volatility since price jumps periodically occur in the market precisely at the moments between sessions.
A more comprehensive estimator that also considers the gaps between sessions was developed based on the Rogers-Satchel formula in the 2000s by Yang-Zhang. See Yang Zhang Volatility for more detail.
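For reference, here are the textbook forms of the three range-based estimators described above as a small Python sketch (daily bars, annualized with 252 periods); the script's exact smoothing and windowing may differ.

```python
import math

def parkinson(h, l, periods=252):
    x = [math.log(hi / lo) ** 2 for hi, lo in zip(h, l)]
    return math.sqrt(sum(x) / (4 * math.log(2) * len(x)) * periods)

def garman_klass(o, h, l, c, periods=252):
    x = [0.5 * math.log(hi / lo) ** 2 - (2 * math.log(2) - 1) * math.log(ci / oi) ** 2
         for oi, hi, lo, ci in zip(o, h, l, c)]
    return math.sqrt(sum(x) / len(x) * periods)

def rogers_satchell(o, h, l, c, periods=252):
    x = [math.log(hi / ci) * math.log(hi / oi) + math.log(lo / ci) * math.log(lo / oi)
         for oi, hi, lo, ci in zip(o, h, l, c)]
    return math.sqrt(sum(x) / len(x) * periods)
```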
Yang-Zhang
Yang Zhang is a historical volatility estimator that handles both opening jumps and the drift and has a minimum estimation error.
We can think of the Yang-Zhang volatility as the combination of the overnight volatility (close-to-open volatility) and a weighted average of the Rogers-Satchell volatility and the day's open-to-close volatility. It is considered to be 14 times more efficient than the close-to-close estimator.
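A sketch of that combination (overnight variance plus a weighted blend of open-to-close and Rogers-Satchell variance); the weighting constant k below is the commonly cited value and is an assumption about this particular script.

```python
import math

def yang_zhang(o, h, l, c, periods=252):
    n = len(c)
    overnight = [math.log(o[i] / c[i - 1]) for i in range(1, n)]
    open_close = [math.log(c[i] / o[i]) for i in range(1, n)]
    rs = [math.log(h[i] / c[i]) * math.log(h[i] / o[i]) +
          math.log(l[i] / c[i]) * math.log(l[i] / o[i]) for i in range(1, n)]

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    m = n - 1  # number of return observations
    k = 0.34 / (1.34 + (m + 1) / (m - 1))  # commonly used weight; assumption for this script
    sigma2 = var(overnight) + k * var(open_close) + (1 - k) * (sum(rs) / len(rs))
    return math.sqrt(sigma2 * periods)
```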
Garman-Klass-Yang-Zhang
Garman Klass is a volatility estimator that incorporates open, low, high, and close prices of a security.
Garman-Klass volatility extends Parkinson's volatility by taking into account the opening and closing price. As markets are most active during the opening and closing of a trading session, it makes volatility estimation more accurate.
Garman and Klass also assumed that the process of price change is a process of continuous diffusion (geometric Brownian motion). However, this assumption has several drawbacks. The method is not robust for opening jumps in price and trend movements.
Despite its drawbacks, the Garman-Klass estimator is still more effective than the basic formula since it takes into account not only the price at the beginning and end of the time interval but also intraday price extremums.
Researchers Rogers and Satchel have proposed a more efficient method for assessing historical volatility that takes into account price trends. See Rogers-Satchell Volatility for more detail.
Exponential Weighted Moving Average
The Exponentially Weighted Moving Average (EWMA) is a quantitative or statistical measure used to model or describe a time series. The EWMA is widely used in finance, the main applications being technical analysis and volatility modeling.
The moving average is designed as such that older observations are given lower weights. The weights fall exponentially as the data point gets older – hence the name exponentially weighted.
The only decision a user of the EWMA must make is the parameter lambda. The parameter decides how important the current observation is in the calculation of the EWMA. The higher the value of lambda, the more closely the EWMA tracks the original time series.
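The EWMA variance recursion itself is simple; a sketch with the widely used RiskMetrics-style lambda of 0.94 (the script's default may differ):

```python
import math

def ewma_volatility(closes, lam=0.94, periods=252):
    # sigma^2_t = lam * sigma^2_{t-1} + (1 - lam) * r_t^2
    rets = [math.log(closes[i] / closes[i - 1]) for i in range(1, len(closes))]
    var = rets[0] ** 2
    for r in rets[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return math.sqrt(var * periods)
```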
Standard Deviation of Log Returns
This is the simplest calculation of volatility. It's the standard deviation of ln(close/close(1)).
Pseudo GARCH(2,2)
This is calculated using a short- and long-run mean of variance multiplied by θ.
θ·avg(var; M) + (1 − θ)·avg(var; N) = 2θ·var/(M + 1 − (M − 1)L) + 2(1 − θ)·var/(N + 1 − (N − 1)L)
Solving for θ can be done by minimizing the mean squared error of estimation; that is, regressing L⁻¹·var − avg(var; N) against avg(var; M) − avg(var; N) and using the resulting beta estimate as θ.
Average True Range
The average true range (ATR) is a technical analysis indicator, introduced by market technician J. Welles Wilder Jr. in his book New Concepts in Technical Trading Systems, that measures market volatility by decomposing the entire range of an asset price for that period.
The true range indicator is taken as the greatest of the following: current high less the current low; the absolute value of the current high less the previous close; and the absolute value of the current low less the previous close. The ATR is then a moving average, generally using 14 days, of the true ranges.
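The true range and its average in sketch form (a simple mean is used here; Wilder's original ATR uses a smoothed moving average):

```python
def true_range(high, low, prev_close):
    return max(high - low, abs(high - prev_close), abs(low - prev_close))

def average_true_range(highs, lows, closes, length=14):
    trs = [true_range(highs[i], lows[i], closes[i - 1]) for i in range(1, len(closes))]
    return sum(trs[-length:]) / min(length, len(trs))
```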
True Range Double
A special case of ATR that attempts to correct for volatility skew.
Chi-squared Confidence Interval:
Confidence interval of volatility is calculated using an inverse CDF of a Chi-Squared Distribution. You can change the volatility input used to either realized, upper confidence interval, or lower confidence interval. This is included in case you'd like to see how far price can extend if volatility hits its upper or lower confidence levels. Generally, you'd just use realized volatility, so I wouldn't change this setting.
Inverse CDF of a Chi-Squared Distribution
The chi-square distribution is a one-parameter family of curves. The parameter ν is the degrees of freedom.
The icdf of the chi-square distribution is
x = F⁻¹(p|ν) = {x : F(x|ν) = p}
where
p = F(x|ν) = ∫₀ˣ t^((ν−2)/2) e^(−t/2) / (2^(ν/2) Γ(ν/2)) dt
ν is the degrees of freedom, and Γ(·) is the Gamma function. The result p is the probability that a single observation from the chi-square distribution with ν degrees of freedom falls in the interval [0, x].
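For reference, here is how a volatility confidence interval can be obtained from the inverse chi-square CDF, sketched with SciPy (the script implements the inverse CDF itself in Pine); the sample size and confidence level below are examples.

```python
import math
from scipy.stats import chi2  # inverse CDF (ppf) of the chi-square distribution

def vol_confidence_interval(sample_vol, n, alpha=0.05):
    """Sketch: (1 - alpha) confidence interval for volatility, given a sample
    standard deviation computed from n return observations."""
    df = n - 1
    lower = sample_vol * math.sqrt(df / chi2.ppf(1 - alpha / 2, df))
    upper = sample_vol * math.sqrt(df / chi2.ppf(alpha / 2, df))
    return lower, upper

print(vol_confidence_interval(0.25, n=30))  # e.g. 25% realized vol from 30 observations
```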
Additional notes on Volatility Panel
-Shows the current timeframe's per-candle volatility back to whatever date you select
-Shows annualized volatility based on the selected days per year and the per-bar volatility; this is automatically calculated no matter the timeframe used. This means that it'll calculate annualized volatility for the current candle even on the 1 second timeframe. Days per year should be 252 for everything but cryptocurrency; however, for all types of tradable assets, anything over the 3 day timeframe will calculate on 365 days.
Probability Panel
This panel shows the probability levels of a user selected upper and lower price boundary. This includes the inside range of volatility between the lower and upper price levels, and the outside probability below the lower price level and above the upper price level. These values are calculated using the CDF (cumulative density function) of a normal distribution. In simpler terms, the CDF returns the area under a bell curve between two points, left and right, or for our purposes, high and low. This yields the probabilities you see in the Probability Panel. See the following graphic to visualize how this works:
The red line is the entry bar; the yellow line is the "mean" but in this case just the chosen source price.
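A sketch of that probability calculation, assuming log-normally distributed prices with zero drift (a common simplification, and an assumption about this script's exact math):

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def probabilities(price, lower, upper, annual_vol, days, year_days=252):
    # Probability of finishing inside / below / above the user-defined price bounds.
    sigma = annual_vol * math.sqrt(days / year_days)
    z_low = math.log(lower / price) / sigma
    z_up = math.log(upper / price) / sigma
    p_below = norm_cdf(z_low)
    p_above = 1.0 - norm_cdf(z_up)
    return {"inside": 1.0 - p_below - p_above, "below": p_below, "above": p_above}

print(probabilities(price=100, lower=90, upper=115, annual_vol=0.40, days=21))
```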
Other things to know
You can turn on/off all labels and levels and fills
Strategy weekly results as numbers v1
This script is based on an idea of monthly statistics that can be found across TradingView community scripts. This is an improved version with weekly results and the ability to define the size of every group (the number of weeks within one group).
Initial setup of the strategy
1. Set the period to calculate the results between.
2. Set the statistic precision and group size.
3. Enable "Recalculate" → "On every tick" under the strategy "Properties" section.
The logic under the hood
1. Get the period between which to calculate the strategy.
2. Calculate the first day of the first week within the period.
3. Calculate the latest day of the latest week within the period.
4. Calculate the results of the selected period.
5. Group the values by the defined number of cells.
6. Calculate the summary of every group.
7. Render the table.
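A minimal sketch of steps 5-6 (grouping weekly results and summing each group); the real script does this on the strategy's equity inside Pine and renders the result as a table.

```python
# Sketch: group weekly returns (in %) into groups of `size` weeks and sum each group.

def group_weekly_results(weekly_returns, size=4):
    groups = []
    for i in range(0, len(weekly_returns), size):
        chunk = weekly_returns[i:i + size]
        groups.append({"weeks": chunk, "total": sum(chunk)})
    return groups

print(group_weekly_results([1.2, -0.5, 3.1, 0.4, -1.0, 2.2], size=4))
```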
Please be careful. To use this tool you will need to enable the "Recalculate" → "On every tick" option, but this means that your strategy will be executed on every tick instead of on bar close. It can cause unexpected results in your strategy behaviour.
go pro 12
TPFX
The foreign exchange market operates as an open-bid, double-auction market. Understanding its microstructure:
It is a combination of a decentralized direct market and a semi-centralized intermediary market. Part of understanding the market structure revolves around the operational efficiency of the FX market (of the daily trading volume, 95% of spot volume involves transfers of funds between market makers).
The market's inefficiency reflects the trading technology used and the design of the mechanism, i.e. how the market responds to the following variables:
The number of market makers (because spreads are quoted competitively, a market maker stepping away does not make the market inactive; many financial institutions participate, not only designated makers), the number of brokers, and the number of clients. In terms of liquidity, this reflects whether there is a sufficient number of participants, in other words the carrying capacity of the market (can the market be considered to have unlimited capacity? Think about it).
Continuing on the issue of liquidity: banks seek larger market share on the rationale that a bank that captures a large share can still earn money from the sheer volume of transactions it handles.
Here, smaller banks come under more pressure to withdraw from the market. Note: small banks mostly do not participate in market making; they largely act as clients of the large banks, and while they have some influence, it is not huge. The greater the volume of flows a bank handles, the greater its ability to leverage client flow information to inform position taking (with shrewdness and intelligence). Market microstructure is the practical study of how assets are exchanged within a specific framework of rules; news or information drives trader flow, and the presence of more than one liquidity provider in a market environment affects the price-adjustment process.
Opening Range with Infinite Price Targets
Opening Range with Infinite Price Targets is an ORB indicator that automatically generates price targets into infinity based on a user-defined % of range.
This indicator includes many nice-to-have features missing from other indicators, such as:
Price Target Labels with a price tooltip. Want to know exactly what price pt3 is at? Hover over it and see.
Custom Defined Range time. Set your range start and end time to whatever you need; it doesn't have to be pinned to the opening range! Note: time is in chart time.
Historical View (Default off), Tired of your chart looking messy with a ton of lines from historical data? No problem! You can choose to view or not view historical data.
Alerts for Range Breaks, First Range Breaks, and Discovery Price Target hits. As well as Exported Values for Range High, Low, and Mean to set your own alerts from custom sources.
Custom Price Targets, set your price targets to a % of the range based on your own strategy.
Last but not least, infinitely generating price targets. They just keep building: new targets will be generated when the price closes above/below the current farthest target (see the sketch below).
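A sketch of how targets can be generated as a fixed % of the range and extended whenever price closes beyond the farthest one; this paraphrases the description and is not the indicator's actual code.

```python
# Sketch: opening-range price targets spaced at a % of the range, extended on demand.

def build_targets(range_high, range_low, pct=0.5, count=3):
    step = pct * (range_high - range_low)
    ups = [range_high + step * i for i in range(1, count + 1)]
    downs = [range_low - step * i for i in range(1, count + 1)]
    return ups, downs

def extend_if_broken(close, ups, downs, range_high, range_low, pct=0.5):
    step = pct * (range_high - range_low)
    # New targets are generated when price closes beyond the current farthest target.
    if close > ups[-1]:
        ups.append(ups[-1] + step)
    if close < downs[-1]:
        downs.append(downs[-1] - step)
    return ups, downs
```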
Enjoy!
kNN
Library "kNN"
Collection of experimental kNN functions. This is a work in progress, an improvement upon my original kNN script:
The script can be recreated with this library. Unlike the original script, that used multiple arrays, this has been reworked with the new Pine Script matrix features.
To make a kNN prediction, the following data should be supplied to the wrapper:
kNN : filter type. Right now either Binary or Percent . Binary works like in the original script: the system stores whether the price has increased (+1) or decreased (-1) since the previous knnStore event (called when either long or short condition is supplied). Percent works the same, but the values stored are the difference of prices in percents. That way larger differences in prices would give higher scores.
k : number k. This is how many nearest neighbors are to be selected (and summed up to get the result).
skew : kNN minimum difference. Normally, the prediction is done with a simple majority of the neighbor votes. If skew is given, then more than a simple majority is needed for a prediction. This also means that there are inputs for which no prediction would be given (if the majority votes are between -skew and +skew). Note that in Percent mode more profitable trades will have higher voting power.
depth : kNN matrix size limit. Originally, the whole available history of trades was used to make a prediction. This not only requires more computational power, but also neglects the fact that the market conditions are changing. This setting restricts the memory matrix to a finite number of past trades.
price : price series
long : long condition. True if the long conditions are met, but filters are not yet applied. For example, in my original script, trades are only made on crossings of fast and slow MAs. So, whenever it is possible to go long, this value is set true. False otherwise.
short : short condition. Same as long , but for short condition.
store : whether the inputs should be stored. Additional filters may be applied to prevent bad trades (for example, trend-based filters), so if you only need to consult kNN without storing the trade, this should be set to false.
feature1 : current value of feature 1. A feature in this case is some kind of data derived from the price. Different features may be used to analyse the price series. For example, oscillator values. Not all of them may be used for kNN prediction. As the current kNN implementation is 2-dimensional, only two features can be used.
feature2 : current value of feature 2.
The wrapper returns a tuple: [ longOK, shortOK ]. This is a pair of filters. When longOK is true, then kNN predicts a long trade may be taken. When shortOK is true, then kNN predicts a short trade may be taken. The kNN filters are returned whenever long or short conditions are met. The trade is supposed to happen when long or short conditions are met and when the kNN filter for the desired direction is true.
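As a rough outline of the mechanism in plain Python (two features, Euclidean distance, sum of the k nearest stored outcomes, optional skew threshold); the library itself stores this state in a Pine matrix, so this is illustrative only.

```python
import math

def knn_filter(memory, f1, f2, k=5, skew=0):
    """memory: list of (feature1, feature2, outcome) from past trades,
    outcome being +1/-1 (Binary) or a % return (Percent).
    Returns (long_ok, short_ok) prediction flags."""
    if len(memory) < k:
        return True, True  # not enough history yet; do not filter
    nearest = sorted(memory, key=lambda m: math.hypot(m[0] - f1, m[1] - f2))
    score = sum(m[2] for m in nearest[:k])
    return score > skew, score < -skew

# Example: two oscillator readings as features, with a small trade memory.
memory = [(30, 0.2, 1), (70, -0.1, -1), (25, 0.3, 1), (60, -0.4, -1), (40, 0.1, 1)]
print(knn_filter(memory, f1=35, f2=0.15, k=3))
```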
Exported functions :
knnStore(knn, p1, p2, src, maxrows)
Store the previous trade; buffer the current one until results are in. Results are binary: up/down
Parameters:
knn : knn matrix
p1 : feature 1 value
p2 : feature 2 value
src : current price
maxrows : limit the matrix size to this number of rows (0 for no limit)
Returns: modified knn matrix
knnStorePercent(knn, p1, p2, src, maxrows)
Store the previous trade; buffer the current one until results are in. Results are in percents
Parameters:
knn : knn matrix
p1 : feature 1 value
p2 : feature 2 value
src : current price
maxrows : limit the matrix size to this number of rows (0 for no limit)
Returns: modified knn matrix
knnGet(distance, result)
Get neighbours by getting k results with the smallest distances
Parameters:
distance : distance array
result : result array
Returns: array slice of k results
knnDistance(knn, p1, p2)
Create a distance array from the two given parameters
Parameters:
knn : knn matrix
p1 : feature 1 value
p2 : feature 2 value
Returns: distance array
knnSum(knn, p1, p2, k)
Make a prediction, finding k nearest neighbours and summing them up
Parameters:
knn : knn matrix
p1 : feature 1 value
p2 : feature 2 value
k : sum k nearest neighbors
Returns: sum of k nearest neighbors
doKNN(kNN, k, skew, depth, price, long, short, store, feature1, feature2)
execute kNN filter
Parameters:
kNN : filter type
k : number k
skew : kNN minimum difference
depth : kNN matrix size limit
price : series
long : long condition
short : short condition
store : store the supplied features (if false, only checks the results without storage)
feature1 : feature 1 value
feature2 : feature 2 value
Returns: filter output
Percent Change Bar Chart
This script shows the percentage change with respect to previous bars. The lookback period can be changed (default 1), so you can see how it fluctuates with the market and volatility.
Times-Revenue (Fundamental Metric)
Times-revenue is calculated by dividing the selling price of a company by the prior 12 months revenue of the company. The result indicates how many times of annual income a buyer was willing to pay for a company.
In color Red: it shows the last annual metric calculated
In color Gray: it shows the last 4 quarters annualized results
Quantitative Backtesting Panel + ROI Table - Shorts
This script is an aggregate of a backtesting panel with quantitative metrics, an ROI table and an open ROI reader. It also contains a mechanism for having a fixed percentage stop loss, similar to the native TV backtester. For shorts only.
Backtesting Panel:
- Certain metrics are color coded, with green being good performance, orange being neutral, red being undesirable.
• ROI : return with the system, in %
• ROI(COMP=1): return if money is compounded at a rate of 100%
• Hit rate: accuracy of the system, as a %
• Profit factor: gross profit/gross loss
• Maximum drawdown: the maximum value from a peak to a successive trough of the system's equity curve
• MAE: Maximum Adverse Excursion. The biggest loss of a trade suffered while the position is still open
• Total trades: total number of closed trades
• Max gain/max loss: shows the biggest win over the biggest loss suffered
• Sharpe ratio: measures the performance of the system with adjusted risk (no comparison to risk-free asset)
• CAGR: Compound Annual Growth Rate. The mean annual rate of growth of the system of n years (provided n>1)
• Kurtosis: measures how heavily the tails of the distribution differ from that of a normal distribution (symmetric on both sides of mean where mean=0, standard deviation=1). A normal distribution has a kurtosis of 3, and skewness of 0. The kurtosis indicates whether or not the tails of the returns contain extreme values
• Skewness: measures the symmetry of the distribution of returns
- Leptokurtic: K > 0. Having more kurtosis than a normal distribution. It's stretched up and to the side too (2nd pic down). High kurtosis (leptokurtic) is bad as the wider tails (called heavy tails) suggest there is relatively high probability of extreme events
- Mesokurtic: K =0. Having the same kurtosis as a normal distribution
- Platykurtic: K < 0. Having less kurtosis than a normal distribution. This suggests there are light tails and fewer extreme events in the distribution
- Skewness is good: +/- 0.5 (fairly symmetrical)
- Skewness is average: -1 to -0.5 or 0.5 to 1 (moderately skewed)
- Skewness is bad: > +/- 1 (highly skewed)
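For reference, a compact sketch of a few of these metrics computed from a list of closed-trade returns (in %); the panel's exact definitions (risk-free rate, annualization, compounding) are not published here, so treat these as illustrative.

```python
import math

def backtest_metrics(trade_returns_pct, years=1.0):
    wins = [r for r in trade_returns_pct if r > 0]
    losses = [r for r in trade_returns_pct if r < 0]
    equity, peak, max_dd = 1.0, 1.0, 0.0
    for r in trade_returns_pct:
        equity *= 1 + r / 100
        peak = max(peak, equity)
        max_dd = max(max_dd, (peak - equity) / peak)
    mean = sum(trade_returns_pct) / len(trade_returns_pct)
    std = math.sqrt(sum((r - mean) ** 2 for r in trade_returns_pct) / (len(trade_returns_pct) - 1))
    return {
        "roi_pct": (equity - 1) * 100,
        "hit_rate_pct": 100 * len(wins) / len(trade_returns_pct),
        "profit_factor": sum(wins) / abs(sum(losses)) if losses else float("inf"),
        "max_drawdown_pct": max_dd * 100,
        "sharpe": mean / std if std else 0.0,  # per-trade, no risk-free rate
        "cagr_pct": (equity ** (1 / years) - 1) * 100,
    }

print(backtest_metrics([2.0, -1.0, 3.5, -0.5, 1.2], years=0.5))
```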
Evolving ROI table:
- The table of ROI values evolves with the year and month. The sum of each year is given. Please avoid using it on non-cryptocurrencies or any market whose trading session is not 24/7
Open ROI reader:
- At the top center is the open ROI of a trade
[ENT] Indicators
The indicator shows:
Opening and closing of trading sessions (KillZones) - Asia, London, New York
The daily open
The previous day's high and low
Separation of the days of the week and their display.
Enjoy!