Stability Max Overload
Stability Max Overload was created in another script I have been working on, found below.
I have broken the code down to only display the Stability features.
What this is:
I was trying to find a way to display, in some form, the Stability or Instability of the US Treasury Bond Market. To help me do that, I came up with three values:
*Stability
*Stability Overload
*Stability Max Overload.
I started with STABILITY. This value is generated from the number of side-by-side inversions in the Bond Market. I wanted this value to range between 0 and 1, with 1 meaning all Bonds are inverted, 0 meaning no Bonds are inverted, and any number of inversions in between producing the corresponding fractional value.
STABILITY OVERLOAD was created from the average size of each inversion.
STABILITY MAX OVERLOAD was then created from the total of all inversions.
The most stable Yield Curve would have no inversions and would therefore generate a 0 for Stability, Stability Overload and Stability Max Overload. The more inversions the Yield Curve has, the higher Stability reads, as Stability is weighted per inversion. With each inversion, data is also taken on the amount by which the Yields are inverted.
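Below is a minimal Pine v5 sketch of how a 0-to-1 Stability reading and its two companions could be computed from adjacent Treasury yields. The yield symbols, the pair list, and the exact interpretation of Overload (average inversion depth) and Max Overload (total inversion depth) are illustrative assumptions rather than the original script's inputs.

```pine
//@version=5
// Hypothetical sketch: Stability as the fraction of inverted adjacent Treasury pairs,
// with Overload and Max Overload derived from the inversion depths.
indicator("Stability (sketch)", overlay=false)

y02 = request.security("TVC:US02Y", timeframe.period, close)
y05 = request.security("TVC:US05Y", timeframe.period, close)
y10 = request.security("TVC:US10Y", timeframe.period, close)
y30 = request.security("TVC:US30Y", timeframe.period, close)

// An adjacent pair is "inverted" when the shorter maturity yields more than the longer one
inv1 = y02 > y05 ? 1 : 0
inv2 = y05 > y10 ? 1 : 0
inv3 = y10 > y30 ? 1 : 0

inverted  = inv1 + inv2 + inv3
stability = inverted / 3.0                 // 0 = no inversions, 1 = all pairs inverted

// Assumed interpretation: Overload = average inversion depth, Max Overload = total inversion depth
d1 = math.max(y02 - y05, 0)
d2 = math.max(y05 - y10, 0)
d3 = math.max(y10 - y30, 0)
overload    = inverted > 0 ? (d1 + d2 + d3) / inverted : 0.0
maxOverload = d1 + d2 + d3

plot(stability,   "Stability",              color=color.orange)
plot(overload,    "Stability Overload",     color=color.red)
plot(maxOverload, "Stability Max Overload", color=color.maroon)
```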
This display shows where we currently stand since Dec 2018. It's a telling story, to say the least. I do plan on continuing the script mentioned above, but again I wanted to release a standalone of the data generated.
Hope you enjoy,
OpptionsOnly
EnsembleX
📌 EnsembleX – Multi-Feature Voting Strategy
//@version=5
//@fenyesk
🔹 Overview
EnsembleX is a multi-indicator ensemble trading strategy that combines price action, momentum, volume, and volatility signals into a unified consensus model. Instead of relying on a single indicator, EnsembleX uses a weighted voting system to determine trade entries and exits, making it more adaptive across different market conditions (crypto, forex, and equities).
The system calculates feature-engineered signals, normalizes them, applies lagged context, and then uses ensemble consensus weighting to decide whether to go long or short. An adaptive threshold (ATR-based) ensures risk-sensitive entries during volatile or quiet regimes.
🔹 Core Features
📈 Trend & Momentum Features
EMA Slope (f_slope): Captures directional bias and steepness of trend.
RSI (f_rsi): Measures overbought/oversold conditions with normalization.
CCI (f_cci): Detects price deviations from mean for extreme reversals.
ADX (f_adx, DMI+/-): Evaluates trend strength and directional dominance.
📊 Volatility Features
Standard Deviation (f_stdev): Captures volatility spikes relative to history.
Bollinger Band Position (f_bb): Measures where price sits within BB envelope.
Log Returns (f_logr): Tracks distribution-adjusted price changes.
💵 Volume-Based Features
MFI (f_mfi): Volume-weighted momentum confirming price moves.
Volume Pressure (f_vol): Combines normalized volume ratio with price change.
🧮 Feature Engineering
Normalization & Z-score scaling: Keeps features comparable across regimes.
Lag Features (optional): Adds short-term historical context to signals.
Composite Aggregates:
Momentum Composite (mom): RSI + CCI + MFI blend.
Trend Composite (trd): ADX + Slope blend.
Volatility Composite (volat): StDev + Volume blend.
🔹 Signal Generation
Each feature produces an expert signal (+1 bull, -1 bear, 0 neutral). Examples:
RSI rising from oversold → Bull signal.
ADX strong + DMI+ dominance → Bull signal.
Bollinger Band breakout + reversal → Bear signal.
Volume pressure > threshold → Directional confirmation.
🔹 Ensemble Voting Mechanism
Each signal is assigned a weight (weight_rsi, weight_adx, weight_mfi, etc.).
Final bull/bear confidence is computed as a weighted probability.
Trades trigger only when consensus ≥ threshold.
Threshold adapts dynamically based on ATR / volatility regime.
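To illustrate the voting mechanics described above, here is a compact Pine v5 sketch with two example experts. The weights, base threshold, and ATR scaling factor are placeholder values, not the strategy's actual parameters.

```pine
//@version=5
// Illustrative sketch of a weighted voting ensemble with an ATR-adaptive threshold.
// The experts, weights, and scaling factor below are placeholders.
strategy("Ensemble voting (sketch)", overlay=true)

// Two example "expert" signals (+1 bull, -1 bear, 0 neutral)
rsiVal  = ta.rsi(close, 14)
sigRsi  = rsiVal < 30 ? 1 : rsiVal > 70 ? -1 : 0
emaFast = ta.ema(close, 21)
sigEma  = close > emaFast ? 1 : close < emaFast ? -1 : 0

// Assumed weights
wRsi = 0.6
wEma = 0.4

// Bull / bear confidence as weighted fractions of total weight
bullConf = (math.max(sigRsi, 0) * wRsi + math.max(sigEma, 0) * wEma) / (wRsi + wEma)
bearConf = (math.max(-sigRsi, 0) * wRsi + math.max(-sigEma, 0) * wEma) / (wRsi + wEma)

// Threshold rises when volatility (ATR relative to price) is elevated
baseThreshold = 0.6
atrPct        = ta.atr(14) / close
adaptiveThr   = baseThreshold * (1 + atrPct * 10)   // assumed scaling factor

if bullConf >= adaptiveThr and bullConf > bearConf
    strategy.entry("Long", strategy.long)
if bearConf >= adaptiveThr and bearConf > bullConf
    strategy.entry("Short", strategy.short)
```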
🔹 Trading Logic
✅ Long Entry:
Bull consensus ≥ threshold and stronger than bear side.
✅ Short Entry:
Bear consensus ≥ threshold and stronger than bull side.
✅ Optional Exits:
Close on opposite signal flip (configurable by position side).
🔹 Visualization
Plots bull and bear confidence curves.
Plots both base threshold and adaptive ATR-adjusted threshold.
Easy to see how consensus builds before trades trigger.
⚡ Key Benefits
Robustness: Reduces reliance on any single indicator.
Flexibility: Works across assets and timeframes (crypto, forex, stocks).
Adaptive: Threshold adjusts automatically in volatile or quiet markets.
Transparency: Plotted consensus and threshold lines make signals easy to interpret.
📢 Usage Notes
Best used on 1h–4h for swing trades, or 5m–15m for intraday setups.
Combine with risk management (TP/SL, position sizing) for live trading.
Ensemble weights (weight_rsi, weight_adx, etc.) can be tuned per asset.
👉 This script is designed for backtesting and research. Results vary depending on the asset, timeframe, and parameter tuning.
Ribbon — multi-MA trend bands
Ribbon paints five translucent bands between six moving averages to visualize trend structure and regime at a glance. You can choose the MA type (EMA/SMA/WMA), customize lengths, and switch the coloring logic between an anchor-based mode and strict alignment.
What it shows
Six MAs on the current timeframe (defaults: 5 / 34 / 55 / 89 / 144 / 233).
Five bands filled between consecutive MAs:
5–34, 34–55, 55–89, 89–144, 144–233.
Optional plotting of MA lines (hidden by default to keep the chart clean).
Coloring modes
1. By EMA233 (Anchor mode)
Each band is colored Up or Down by comparing its upper MA to the anchor (the 6th MA in inputs, default length 233).
If MA > anchor → Up color (supportive regime).
If MA < anchor → Down color (resistive regime).
2. By Alignment
All bands share one color depending on strict ordering:
Up if MA1 > MA2 > MA3 > MA4 > MA5 > MA6
Down if MA1 < MA2 < MA3 < MA4 < MA5 < MA6
Gray otherwise (no clean alignment).
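For reference, a minimal Pine v5 sketch of the two coloring rules using EMA and the default lengths named above; only the first band is filled to keep it short.

```pine
//@version=5
// Sketch of the two coloring modes, using EMA and the default lengths; only the first band is filled here.
indicator("Ribbon coloring (sketch)", overlay=true)

useAnchorMode = input.bool(true, "Color by anchor (MA6)")

ma1 = ta.ema(close, 5)
ma2 = ta.ema(close, 34)
ma3 = ta.ema(close, 55)
ma4 = ta.ema(close, 89)
ma5 = ta.ema(close, 144)
ma6 = ta.ema(close, 233)   // anchor

// Anchor mode: the band is colored by its upper MA vs. the anchor
anchorColor = ma1 > ma6 ? color.green : color.red

// Alignment mode: one color for the whole ribbon, gray when not strictly ordered
alignedUp   = ma1 > ma2 and ma2 > ma3 and ma3 > ma4 and ma4 > ma5 and ma5 > ma6
alignedDown = ma1 < ma2 and ma2 < ma3 and ma3 < ma4 and ma4 < ma5 and ma5 < ma6
alignColor  = alignedUp ? color.green : alignedDown ? color.red : color.gray

bandColor = useAnchorMode ? anchorColor : alignColor

p1 = plot(ma1, display=display.none)
p2 = plot(ma2, display=display.none)
fill(p1, p2, color=color.new(bandColor, 80))
```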
Inputs (key)
MA Type : EMA / SMA / WMA (applies to all six MAs).
MA 1…MA 6 (anchor) : lengths for each average (defaults form a classic ribbon up to 233).
Up/Down colors : band palette.
Base transparency / step : controls band opacity gradient (top band uses Base, each next band adds Step).
Show MA lines + Lines transparency : optionally draw the six MA curves.
How to read it
Directional bias : when most bands are green (anchor mode) or the whole ribbon is green (alignment mode), momentum favors the upside; red implies downside pressure.
Quality of trend : a persistent alignment (all ordered) signals a cleaner trend. Mixed/gray suggests chop or transition.
Pullback zones : price returning toward inner bands can mark areas to watch for continuation vs. failure.
Implementation notes
No higher-timeframe data, no lookahead — this is a non-repainting, current-TF visualization.
Bands still render even when MA lines are hidden (the script uses hidden plot anchors under the hood).
This is an indicator, not a strategy — it does not open/close trades or calculate P&L.
Disclaimer
This script is for educational and informational purposes only and does not constitute financial advice. Always test on historical data and manage risk appropriately.
Auto-Fit Growth Trendline
# **Theoretical Algorithmic Principles of the Auto-Fit Growth Trendline (AFGT)**
## **🎯 What Does This Algorithm Do?**
The Auto-Fit Growth Trendline is an advanced technical analysis system that **automates the identification of long-term growth trends** and **projects future price levels** based on historical cyclical patterns.
### **Primary Functionality:**
- **Automatically detects** the most significant lows in regular periods (monthly, quarterly, semi-annually, annually)
- **Constructs a dynamic trendline** that connects these historical lows
- **Projects the trend into the future** with high mathematical precision
- **Generates Fibonacci bands** that act as dynamic support and resistance levels
- **Automatically adapts** to different timeframes and market conditions
### **Strategic Purpose:**
The algorithm is designed to identify **fundamental value zones** where price has historically found support, enabling traders to:
- Identify optimal entry points for long positions
- Establish realistic price targets based on mathematical projections
- Recognize dynamic support and resistance levels
- Anticipate long-term price movements
---
## **🧮 Core Mathematical Foundations**
### **Adaptive Temporal Segmentation Theory**
The algorithm is based on **dynamic temporal partition theory**, where time is divided into mathematically coherent uniform intervals. It uses modular transformations to create bijective mappings between continuous timestamps and discrete periods, ensuring each temporal point belongs uniquely to a specific period.
**What does this achieve?** It allows the algorithm to automatically identify natural market cycles (annual, quarterly, etc.) without manual intervention, adapting to the inherent periodicity of each asset.
The temporal mapping function implements a **discrete affine transformation** that normalizes different frequencies (monthly, quarterly, semi-annual, annual) to a space of unique identifiers, enabling consistent cross-temporal comparative analysis.
---
## **📊 Local Extrema Detection Theory**
### **Multi-Point Retrospective Validation Principle**
Local minima detection is founded on **relative extrema theory with sliding window**. Instead of using a simple minimum finder, it implements a cross-validation system that examines the persistence of the extremum across multiple historical periods.
**What problem does this solve?** It eliminates false minima caused by temporal volatility, identifying only those points that represent true historical support levels with statistical significance.
This approach is based on the **statistical confirmation principle**, where a minimum is only considered valid if it maintains its extremum condition during a defined observation period, significantly reducing false positives caused by transitory volatility.
---
## **🔬 Robust Interpolation Theory with Outlier Control**
### **Contextual Adaptive Interpolation Model**
The mathematical core uses **piecewise linear interpolation with adaptive outlier correction**. The key innovation lies in implementing a **contextual anomaly detector** that identifies not only absolute extreme values, but relative deviations to the local context.
**Why is this important?** Financial markets contain extreme events (crashes, bubbles) that can distort projections. This system identifies and appropriately weights them without completely eliminating them, preserving directional information while attenuating distortions.
### **Implicit Bayesian Smoothing Algorithm**
When an outlier is detected (deviation >300% of local average), the system applies a **simplified Kalman filter** that combines the current observation with a local trend estimation, using a weight factor that preserves directional information while attenuating extreme fluctuations.
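A rough Pine v5 sketch of this dampening step, assuming a simple moving average as the local context and an illustrative blend weight; the 300% deviation threshold comes from the description above.

```pine
//@version=5
// Sketch: observations deviating more than 300% from the local average are blended toward a
// local trend estimate instead of being taken at face value or discarded.
indicator("Outlier dampening (sketch)", overlay=true)

localLen = input.int(20, "Local context length")
blendW   = input.float(0.3, "Weight given to the outlier observation")   // assumed weight factor

localAvg  = ta.sma(close, localLen)
deviation = math.abs(close - localAvg) / localAvg

var float smoothed = na
trendSlope = localAvg - nz(localAvg[1], localAvg)
trendEst   = nz(smoothed, close) + trendSlope        // previous smoothed value extended by the local slope

isOutlier = nz(deviation) > 3.0                      // >300% of the local average
smoothed := isOutlier ? blendW * close + (1 - blendW) * trendEst : close

plot(smoothed, "Dampened series", color=color.teal)
```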
---
## **📈 Stabilized Extrapolation Theory**
### **Exponential Growth Model with Dampening**
Extrapolation is based on a **modified exponential growth model with progressive dampening**. It uses multiple historical points to calculate local growth ratios, implements statistical filtering to eliminate outliers, and applies a dampening factor that increases with extrapolation distance.
**What advantage does this offer?** Long-term projections in finance tend to be exponentially unrealistic. This system maintains short-to-medium term accuracy while converging toward realistic long-term projections, avoiding the typical "exponential explosions" of other methods.
### **Asymptotic Convergence Principle**
For long-term projections, the algorithm implements **controlled asymptotic convergence**, where growth ratios gradually converge toward pre-established limits, avoiding unrealistic exponential projections while preserving short-to-medium term accuracy.
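A simplified Pine v5 sketch of the dampened-growth idea: the per-bar growth ratio estimated from recent history is pulled toward 1 as the projection distance increases. The lookback, horizon, and dampening constant are illustrative assumptions.

```pine
//@version=5
// Sketch of dampened extrapolation: the estimated per-bar growth ratio is pulled toward 1
// as the projection distance grows, so long-range projections flatten out instead of exploding.
indicator("Dampened extrapolation (sketch)", overlay=false)

lookback = input.int(52, "Bars used to estimate growth", minval=2)
horizon  = input.int(26, "Bars projected forward", minval=1)
damp     = input.float(0.05, "Dampening per projected bar", minval=0.0)

// Average per-bar growth ratio over the lookback window
growth = math.pow(close / nz(close[lookback], close), 1.0 / lookback)

// Project forward, shrinking the ratio toward 1 with distance
proj = close
for i = 1 to horizon
    dampedRatio = 1.0 + (growth - 1.0) / (1.0 + damp * i)
    proj := proj * dampedRatio

plot(proj, "Dampened projection", color=color.fuchsia)
```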
---
## **🌟 Dynamic Fibonacci Projection Theory**
### **Continuous Proportional Scaling Model**
Fibonacci bands are constructed through **uniform proportional scaling** of the base curve, where each level represents a linear transformation of the main curve by a constant factor derived from the Fibonacci sequence.
**What is its practical utility?** It provides dynamic resistance and support levels that move with the trend, offering price targets and profit-taking points that automatically adapt to market evolution.
### **Topological Preservation Principle**
The system maintains the **topological properties** of the base curve in all Fibonacci projections, ensuring that spatial and temporal relationships are consistently preserved across all resistance/support levels.
---
## **⚡ Adaptive Computational Optimization**
### **Multi-Scale Resolution Theory**
It implements **automatic multi-resolution analysis** where data granularity is dynamically adjusted according to the analysis timeframe. It uses the **adaptive Nyquist principle** to optimize the signal-to-noise ratio according to the temporal observation scale.
**Why is this necessary?** Different timeframes require different levels of detail. A 1-minute chart needs more granularity than a monthly one. This system automatically optimizes resolution for each case.
### **Adaptive Density Algorithm**
Calculation point density is optimized through **adaptive sampling theory**, where calculation frequency is adjusted according to local trend curvature and analysis timeframe, balancing visual precision with computational efficiency.
---
## **🛡️ Robustness and Fault Tolerance**
### **Graceful Degradation Theory**
The system implements **multi-level graceful degradation**, where under error conditions or insufficient data, the algorithm progressively falls back to simpler but reliable methods, maintaining basic functionality under any condition.
**What does this guarantee?** That the indicator functions consistently even with incomplete data, new symbols with limited history, or extreme market conditions.
### **State Consistency Principle**
It uses **mathematical invariants** to guarantee that the algorithm's internal state remains consistent between executions, implementing consistency checks that validate data structure integrity in each iteration.
---
## **🔍 Key Theoretical Innovations**
### **A. Contextual vs. Absolute Outlier Detection**
It revolutionizes traditional outlier detection by considering not only the absolute magnitude of deviations, but their relative significance within the local context of the time series.
**Practical impact:** It distinguishes between legitimate market movements and technical anomalies, preserving important events like breakouts while filtering noise.
### **B. Extrapolation with Weighted Historical Memory**
It implements a memory system that weights different historical periods according to their relevance for current prediction, creating projections more adaptable to market regime changes.
**Competitive advantage:** It automatically adapts to fundamental changes in asset dynamics without requiring manual recalibration.
### **C. Automatic Multi-Timeframe Adaptation**
It develops an automatic temporal resolution selection system that optimizes signal extraction according to the intrinsic characteristics of the analysis timeframe.
**Result:** A single indicator that functions optimally from 1-minute to monthly charts without manual adjustments.
### **D. Intelligent Asymptotic Convergence**
It introduces the concept of controlled asymptotic convergence in financial extrapolations, where long-term projections converge toward realistic limits based on historical fundamentals.
**Added value:** Mathematically sound long-term projections that avoid the unrealistic extremes typical of other extrapolation methods.
---
## **📊 Complexity and Scalability Theory**
### **Optimized Linear Complexity Model**
The algorithm maintains **linear computational complexity** O(n) in the number of historical data points, guaranteeing scalability for extensive time series analysis without performance degradation.
### **Temporal Locality Principle**
It implements **temporal locality**, where the most expensive operations are concentrated in the most relevant temporal regions (recent periods and near projections), optimizing computational resource usage.
---
## **🎯 Convergence and Stability**
### **Probabilistic Convergence Theory**
The system guarantees **probabilistic convergence** toward the real underlying trend, where projection accuracy increases with the amount of available historical data, following **law of large numbers** principles.
**Practical implication:** The more history an asset has, the more accurate the algorithm's projections will be.
### **Guaranteed Numerical Stability**
It implements **intrinsic numerical stability** through the use of robust floating-point arithmetic and validations that prevent overflow, underflow, and numerical error propagation.
**Result:** Reliable operation even with extreme-priced assets (from satoshis to thousand-dollar stocks).
---
## **💼 Comprehensive Practical Application**
**The algorithm functions as a "financial GPS"** that:
1. **Identifies where we've been** (significant historical lows)
2. **Determines where we are** (current position relative to the trend)
3. **Projects where we're going** (future trend with specific price levels)
4. **Provides alternative routes** (Fibonacci bands as alternative targets)
This theoretical framework represents an innovative synthesis of time series analysis, approximation theory, and computational optimization, specifically designed for long-term financial trend analysis with robust and mathematically grounded projections.
ATR Rope
ATR Rope is inspired by DonovanWall's "Range Filter". It implements a similar concept of filtering out smaller market movements and adjusting only for larger moves. In addition, this indicator goes one step deeper by producing actionable zones to determine market state. (Trend vs. Consolidation)
> Background
When reading up on the Range Filter indicator, it reminded me exactly of a rope-stabilization drawing tool in a program I use frequently. Rope stabilization essentially attaches a fixed-length "rope" between your cursor and an anchor point (the brush). As you move your cursor, you pull the brush behind it. The cursor (of course) will not pull the brush until the rope is fully extended; this behavior filters out jittery movements and is used to produce smoother drawing curves.
If compared visually side-by-side, you will notice that this indicator bears striking resemblance to its inspiration.
> Goal
Other than simply distinguishing between meaningful price movements and noise, this indicator strives to create a rigid structure to frame market movements and the lack thereof, such as when to anticipate trend and when to suspect consolidation.
Since the indicator works based on an ATR range, the resulting ATR Channel does well to get reactions from price at its extremes. Naturally, when consolidating, price will remain within the channel, neither pushing the channel significantly up or down. Likewise, when trending, price will continue to push the channel in a single direction.
With the goal of keeping it quick and simple, this indicator does not do any smoothing of data feeds and is simply based on the deviation of price from the central rope, adjusting the rope only when price extends past the threshold created by +/- ATR from the rope.
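A minimal Pine v5 sketch of that update rule, with assumed default inputs; the published script's exact implementation may differ.

```pine
//@version=5
// Sketch of the rope update rule: the rope stays put while the source is inside +/- ATR of it,
// and is dragged by the overshoot amount when the source escapes that range.
indicator("ATR Rope (sketch)", overlay=true)

src    = input.source(close, "Source")
length = input.int(14, "ATR Length")
mult   = input.float(1.0, "ATR Multiplier")

band = ta.atr(length) * mult

var float rope = na
upperBreak = src > nz(rope, src) + band
lowerBreak = src < nz(rope, src) - band
rope := na(rope) ? src : upperBreak ? src - band : lowerBreak ? src + band : rope

// Green when pulled up, red when pulled down, blue when the rope is unchanged (flat)
ropeColor = rope > rope[1] ? color.green : rope < rope[1] ? color.red : color.blue
plot(rope, "ATR Rope", color=ropeColor, linewidth=2)
```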
> Features & Behaviors
- ATR Rope
ATR Rope is displayed as a single, three-color line.
This can be considered the center line, or the directional line, whichever you'd prefer.
The main point of the Rope display is to indicate direction, however it also is factually the center of the current working range.
- ATR Rope Color
When the rope's value moves up, it changes to green (uptrend), when down, red (downtrend).
When the source crosses the rope, it turns blue (flat).
With these simple rules, we've formed a structure to view market movements.
- Consolidation Zones
Consolidation Zones generate from "Flat" areas and extend into the subsequent trend areas. Consolidation is simply an area where price has crossed the Rope and remains inside the range. Over these periods, the upper and lower values are accumulated and averaged together to form the "Consolidation Zone" values. These zones are drawn live, so values are averaged as the flat area progresses and do not repaint; all values seen historically are as they would have appeared in real time.
- ATR Channel
ATR Channel displays the upper and lower bounds of the working range.
When the source moves beyond this range, the rope is adjusted based on the distance from the source to the channel. This range can be extremely useful to view, but by default it is hidden.
> Application
This indicator is not created to provide signals, or serve as a "complete" system.
(People who didn't read this far will still comment for signals. :) )
This is created to be used alongside manual interpretation and intuition. This indicator is not meant to constrain any users into a box, and I would actually encourage an open mind and idea generation, as the application of this indicator can take various forms.
> Examples
As you would probably already know, price movement can be fast impulses, and movement can be slow bleeds. In the screenshot below, we are using movements from and to consolidation zones to classify weak trend and strong trend. As you can see, there are also areas of consolidation which get broken out of and confirmed for the larger moves.
Author's Note: In each of these examples, I have outlined the start and end of each session. These examples come from 1-minute futures charts and have specifically been framed with day trading in mind.
"Breakout Retest" or "Support/Resistance Flips" or "Structure Retests" are all generally the same thing, with different traders referring to them by different names, all of which can be seen throughout these examples.
In the next example, we have a day which started with an early reversal leading into long, slow, trend. Notice how each area throughout the trend essentially moves slightly higher, then consolidates while holding support of the previous zone. This day had a few sharp movements, however there was a large amount of neutrality throughout this day with continuous higher lows.
In contrast to the previous example, next up, we have a very choppy day. Throughout which we see a significant amount of retests before fast directional movements. We also see a few examples of places where previous zones remained relevant into the future. While the zones only display into the resulting trend area, they do not become immediately meaningless once they stop drawing.
> Abstract
In the screenshot below, I have stacked 2 of these indicators, using the high as the source for one and the low as the source for the other. I've hidden lines of the high and low channels to create a 4 lined channel based on the wicks of price.
This is not necessary to use the indicator, but should help provide an idea of creative ways the simple indicator could be used to produce more complicated analysis.
If you've made it this far, I would hope it's clear to you how this indicator could provide value to your trading.
Thank you to DonovanWall for the inspiration.
Enjoy!
Triple Stochastic
Triple Stochastic Elasticity Indicator
This custom indicator leverages the power of multi-timeframe analysis by combining three Stochastic Oscillators across different timeframes to identify potential trade entries based on elasticity and divergence between momentum curves.
📊 How It Works:
The indicator plots Stochastic values from three timeframes (e.g., 5m, 15m, and 1h), allowing you to observe how momentum behaves at different scales.
It highlights moments of elasticity—where the Stochastics stretch apart and then begin to converge—potentially signaling a reversion opportunity or trend continuation.
By identifying these stretches and snapbacks in momentum alignment, you can better time your entries and exits with improved confidence.
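A minimal Pine v5 sketch of the multi-timeframe idea, requesting the same smoothed %K from three timeframes; the timeframes and lengths shown are only the examples mentioned above.

```pine
//@version=5
// Sketch: the same %K is requested from three timeframes so the momentum curves can be compared.
indicator("Triple Stochastic (sketch)", overlay=false)

kLen    = input.int(14, "%K Length")
kSmooth = input.int(3,  "%K Smoothing")
tf1 = input.timeframe("5",  "Timeframe 1")
tf2 = input.timeframe("15", "Timeframe 2")
tf3 = input.timeframe("60", "Timeframe 3")

k1 = request.security(syminfo.tickerid, tf1, ta.sma(ta.stoch(close, high, low, kLen), kSmooth))
k2 = request.security(syminfo.tickerid, tf2, ta.sma(ta.stoch(close, high, low, kLen), kSmooth))
k3 = request.security(syminfo.tickerid, tf3, ta.sma(ta.stoch(close, high, low, kLen), kSmooth))

plot(k1, "Stoch TF1", color=color.aqua)
plot(k2, "Stoch TF2", color=color.orange)
plot(k3, "Stoch TF3", color=color.purple)
hline(80, "Overbought")
hline(20, "Oversold")
```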
🔍 Use Case:
Look for divergence or convergence between the Stochastics.
Ideal for trend-following entries, pullback setups, and momentum reversal spotting.
Works best when combined with price action, S/R zones, or volume confirmation.
🛠 Customization:
Timeframes for each Stochastic are fully customizable.
Options to tweak %K, %D, and smoothing values to fit your strategy.
I recommend removing the %D and using the following settings:
5 : 3 : 3
14 : 3 : 3
56 : 12 : 12
Visual alerts can be added for when certain conditions are met (e.g., all three Stochs cross overbought/oversold levels).
Bitcoin Polynomial Regression Model
This is the main version of the script. Click here for the Oscillator part of the script.
💡Why this model was created:
One of the key issues with most existing models, including our own Bitcoin Log Growth Curve Model, is that they often fail to realistically account for diminishing returns. As a result, they may present overly optimistic bull cycle targets (hence, we introduced alternative settings in our previous Bitcoin Log Growth Curve Model).
This new model, however, has been built from the ground up with a primary focus on incorporating the principle of diminishing returns. It directly responds to this concept, which has been briefly explored here.
📉The theory of diminishing returns:
This theory suggests that as each four-year market cycle unfolds, volatility gradually decreases, leading to more tempered price movements. It also implies that the price increase from one cycle peak to the next will decrease over time as the asset matures. The same pattern applies to cycle lows and the relationship between tops and bottoms. In essence, these price movements are interconnected and should generally follow a consistent pattern. We believe this model provides a more realistic outlook on bull and bear market cycles.
To better understand this theory, the relationships between cycle tops and bottoms are outlined below: https://www.tradingview.com/x/7Hldzsf2/
🔧Creation of the model:
For those interested in how this model was created, the process is explained here. Otherwise, feel free to skip this section.
This model is based on two separate cubic polynomial regression lines. One for the top price trend and another for the bottom. Both follow the general cubic polynomial function:
ax^3 + bx^2 + cx + d.
In this equation, x represents the weekly bar index minus an offset, while a, b, c, and d are determined through polynomial regression analysis. The input (x, y) values used for the polynomial regression analysis are as follows:
Top regression line (x, y) values:
113, 18.6
240, 1004
451, 19128
655, 65502
Bottom regression line (x, y) values:
103, 2.5
267, 211
471, 3193
676, 16255
The values above correspond to historical Bitcoin cycle tops and bottoms, where x is the weekly bar index and y is the weekly closing price of Bitcoin. The best fit is determined using metrics such as R-squared values, residual error analysis, and visual inspection. While the exact details of this evaluation are beyond the scope of this post, the following optimal parameters were found:
Top regression line parameter values:
a: 0.000202798
b: 0.0872922
c: -30.88805
d: 1827.14113
Bottom regression line parameter values:
a: 0.000138314
b: -0.0768236
c: 13.90555
d: -765.8892
📊Polynomial Regression Oscillator:
This publication also includes the oscillator version of this model, which is displayed at the bottom of the screen. The oscillator applies a logarithmic transformation to the price and the regression lines using the formula log10(x).
The log-transformed price is then normalized using min-max normalization relative to the log-transformed top and bottom regression lines with the formula:
normalized price = (log(close) - log(bottom regression line)) / (log(top regression line) - log(bottom regression line))
This transformation maps the price to a value between 0 and 1 relative to the two regression lines. The Oscillator version can be found here.
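For readers who want to reproduce the curves, here is a Pine v5 sketch that evaluates both cubic polynomials with the parameters listed above and applies the normalization formula. The bar-index offset is an assumption, as the fitted offset is not stated.

```pine
//@version=5
// Sketch: evaluate the two cubic regression curves using the parameters listed above and
// normalize the weekly close between them. Intended for the weekly chart; the bar-index
// offset is an assumption (the model uses its own fitted offset).
indicator("BTC polynomial regression (sketch)", overlay=false)

offset = input.int(0, "Weekly bar index offset")
x = bar_index - offset

// Top regression line parameters (from the description)
aTop = 0.000202798
bTop = 0.0872922
cTop = -30.88805
dTop = 1827.14113
// Bottom regression line parameters (from the description)
aBot = 0.000138314
bBot = -0.0768236
cBot = 13.90555
dBot = -765.8892

topCurve = aTop * math.pow(x, 3) + bTop * math.pow(x, 2) + cTop * x + dTop
botCurve = aBot * math.pow(x, 3) + bBot * math.pow(x, 2) + cBot * x + dBot

// Min-max normalization in log space: 0 at the bottom curve, 1 at the top curve.
// Only meaningful once x is large enough that both curves are positive.
osc = (math.log10(close) - math.log10(botCurve)) / (math.log10(topCurve) - math.log10(botCurve))

plot(osc, "Normalized price", color=color.orange)
hline(1.0, "Top curve")
hline(0.0, "Bottom curve")
```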
🔍Interpretation of the Model:
In general, the red area represents a caution zone, as historically, the price has often been near its cycle market top within this range. On the other hand, the green area is considered an area of opportunity, as historically, it has corresponded to the market bottom.
The top regression line serves as a signal for the absolute market cycle peak, while the bottom regression line indicates the absolute market cycle bottom.
Additionally, this model provides a predicted range for Bitcoin's future price movements, which can be used to make extrapolated predictions. We will explore this further below.
🔮Future Predictions:
Finally, let's discuss what this model actually predicts for the potential upcoming market cycle top and the corresponding market cycle bottom. In our previous post here, a cycle interval analysis was performed to predict a likely time window for the next cycle top and bottom:
In the image, it is predicted that the next top-to-top cycle interval will be 208 weeks, which translates to November 3rd, 2025. It is also predicted that the bottom-to-top cycle interval will be 152 weeks, which corresponds to October 13th, 2025. On the macro level, these two dates align quite well. For our prediction, we take the average of these two dates: October 24th 2025. This will be our target date for the bull cycle top.
Now, let's do the same for the upcoming cycle bottom. The bottom-to-bottom cycle interval is predicted to be 205 weeks, which translates to October 19th, 2026, and the top-to-bottom cycle interval is predicted to be 259 weeks, which corresponds to October 26th, 2026. We then take the average of these two dates, predicting a bear cycle bottom date target of October 19th, 2026.
Now that we have our predicted top and bottom cycle date targets, we can simply reference these two dates to our model, giving us the Bitcoin top price prediction in the range of 152,000 in Q4 2025 and a subsequent bottom price prediction in the range of 46,500 in Q4 2026.
For those interested in understanding what this specifically means for the predicted diminishing return top and bottom cycle values, the image below displays these predicted values. The new values are highlighted in yellow:
And of course, keep in mind that these targets are just rough estimates. While we've done our best to estimate these targets through a data-driven approach, markets will always remain unpredictable in nature. What are your targets? Feel free to share them in the comment section below.
Bitcoin Polynomial Regression Oscillator
This is the oscillator version of the script. Click here for the other part of the script.
💡Why this model was created:
One of the key issues with most existing models, including our own Bitcoin Log Growth Curve Model, is that they often fail to realistically account for diminishing returns. As a result, they may present overly optimistic bull cycle targets (hence, we introduced alternative settings in our previous Bitcoin Log Growth Curve Model).
This new model, however, has been built from the ground up with a primary focus on incorporating the principle of diminishing returns. It directly responds to this concept, which has been briefly explored here.
📉The theory of diminishing returns:
This theory suggests that as each four-year market cycle unfolds, volatility gradually decreases, leading to more tempered price movements. It also implies that the price increase from one cycle peak to the next will decrease over time as the asset matures. The same pattern applies to cycle lows and the relationship between tops and bottoms. In essence, these price movements are interconnected and should generally follow a consistent pattern. We believe this model provides a more realistic outlook on bull and bear market cycles.
To better understand this theory, the relationships between cycle tops and bottoms are outlined below: https://www.tradingview.com/x/7Hldzsf2/
🔧Creation of the model:
For those interested in how this model was created, the process is explained here. Otherwise, feel free to skip this section.
This model is based on two separate cubic polynomial regression lines. One for the top price trend and another for the bottom. Both follow the general cubic polynomial function:
ax^3 + bx^2 + cx + d.
In this equation, x represents the weekly bar index minus an offset, while a, b, c, and d are determined through polynomial regression analysis. The input (x, y) values used for the polynomial regression analysis are as follows:
Top regression line (x, y) values:
113, 18.6
240, 1004
451, 19128
655, 65502
Bottom regression line (x, y) values:
103, 2.5
267, 211
471, 3193
676, 16255
The values above correspond to historical Bitcoin cycle tops and bottoms, where x is the weekly bar index and y is the weekly closing price of Bitcoin. The best fit is determined using metrics such as R-squared values, residual error analysis, and visual inspection. While the exact details of this evaluation are beyond the scope of this post, the following optimal parameters were found:
Top regression line parameter values:
a: 0.000202798
b: 0.0872922
c: -30.88805
d: 1827.14113
Bottom regression line parameter values:
a: 0.000138314
b: -0.0768236
c: 13.90555
d: -765.8892
📊Polynomial Regression Oscillator:
This publication also includes the oscillator version of this model, which is displayed at the bottom of the screen. The oscillator applies a logarithmic transformation to the price and the regression lines using the formula log10(x).
The log-transformed price is then normalized using min-max normalization relative to the log-transformed top and bottom regression lines with the formula:
normalized price = (log(close) - log(bottom regression line)) / (log(top regression line) - log(bottom regression line))
This transformation maps the price to a value between 0 and 1 relative to the two regression lines.
🔍Interpretation of the Model:
In general, the red area represents a caution zone, as historically, the price has often been near its cycle market top within this range. On the other hand, the green area is considered an area of opportunity, as historically, it has corresponded to the market bottom.
The top regression line serves as a signal for the absolute market cycle peak, while the bottom regression line indicates the absolute market cycle bottom.
Additionally, this model provides a predicted range for Bitcoin's future price movements, which can be used to make extrapolated predictions. We will explore this further below.
🔮Future Predictions:
Finally, let's discuss what this model actually predicts for the potential upcoming market cycle top and the corresponding market cycle bottom. In our previous post here, a cycle interval analysis was performed to predict a likely time window for the next cycle top and bottom:
In the image, it is predicted that the next top-to-top cycle interval will be 208 weeks, which translates to November 3rd, 2025. It is also predicted that the bottom-to-top cycle interval will be 152 weeks, which corresponds to October 13th, 2025. On the macro level, these two dates align quite well. For our prediction, we take the average of these two dates: October 24th 2025. This will be our target date for the bull cycle top.
Now, let's do the same for the upcoming cycle bottom. The bottom-to-bottom cycle interval is predicted to be 205 weeks, which translates to October 19th, 2026, and the top-to-bottom cycle interval is predicted to be 259 weeks, which corresponds to October 26th, 2026. We then take the average of these two dates, predicting a bear cycle bottom date target of October 19th, 2026.
Now that we have our predicted top and bottom cycle date targets, we can simply reference these two dates to our model, giving us the Bitcoin top price prediction in the range of 152,000 in Q4 2025 and a subsequent bottom price prediction in the range of 46,500 in Q4 2026.
For those interested in understanding what this specifically means for the predicted diminishing return top and bottom cycle values, the image below displays these predicted values. The new values are highlighted in yellow:
And of course, keep in mind that these targets are just rough estimates. While we've done our best to estimate these targets through a data-driven approach, markets will always remain unpredictable in nature. What are your targets? Feel free to share them in the comment section below.
DeepSignalFilterHelpers
Library "DeepSignalFilterHelpers"
filter_intraday_intensity(useIiiFilter)
Parameters:
useIiiFilter (bool)
filter_vwma(src, length, useVwmaFilter)
Parameters:
src (float)
length (int)
useVwmaFilter (bool)
filter_nvi(useNviFilter)
Parameters:
useNviFilter (bool)
filter_emv(length, emvThreshold, useEmvFilter, useMovingAvg)
EMV filter for filtering signals based on Ease of Movement
Parameters:
length (int) : The length of the EMV calculation
emvThreshold (float) : The EMV threshold
useEmvFilter (bool) : Whether to apply the EMV filter
useMovingAvg (bool) : Whether to use moving average as threshold
Returns: Filtered result indicating whether the signal should be used
filter_adi(length, threshold, useAdiFilter, useMovingAvg)
ADI filter for filtering signals based on Accumulation/Distribution Index
Parameters:
length (int) : The length of the ADI moving average calculation
threshold (float) : The ADI threshold
useAdiFilter (bool) : Whether to apply the ADI filter
useMovingAvg (bool) : Whether to use moving average as threshold
Returns: Filtered result indicating whether the signal should be used
filter_mfi(length, mfiThreshold, useMfiFilter, useMovingAvg)
MFI filter for filtering signals based on Money Flow Index
Parameters:
length (int) : The length of the MFI calculation
mfiThreshold (float) : The MFI threshold
useMfiFilter (bool) : Whether to apply the MFI filter
useMovingAvg (bool) : Whether to use moving average as threshold
Returns: Filtered result indicating whether the signal should be used
detect_obv_states(obvThresholdStrong, obvThresholdModerate, lookbackPeriod, obvMode)
detect_obv_states: Identify OBV states with three levels (Strong, Moderate, Weak) over a configurable period
Parameters:
obvThresholdStrong (float) : Threshold for strong OBV movements
obvThresholdModerate (float) : Threshold for moderate OBV movements
lookbackPeriod (int) : Number of periods to analyze OBV trends
obvMode (string) : OBV mode to filter ("Strong", "Moderate", "Weak")
Returns: OBV state ("Strong Up", "Moderate Up", "Weak Up", "Positive Divergence", "Negative Divergence", "Consolidation", "Weak Down", "Moderate Down", "Strong Down")
filter_obv(src, length, obvMode, threshold, useObvFilter, useMovingAvg)
filter_obv: Filter signals based on OBV states
Parameters:
src (float) : The source series (default: close)
length (int) : The length of the OBV moving average calculation
obvMode (string) : OBV mode to filter ("Strong", "Moderate", "Weak")
threshold (float) : Optional threshold for additional filtering
useObvFilter (bool) : Whether to apply the OBV filter
useMovingAvg (bool) : Whether to use moving average as threshold
Returns: Filtered result indicating whether the signal should be used
filter_cmf(length, cmfThreshold, useCmfFilter, useMovingAvg)
CMF filter for filtering signals based on Chaikin Money Flow
Parameters:
length (int) : The length of the CMF calculation
cmfThreshold (float) : The CMF threshold
useCmfFilter (bool) : Whether to apply the CMF filter
useMovingAvg (bool) : Whether to use moving average as threshold
Returns: Filtered result indicating whether the signal should be used
filter_vwap(useVwapFilter)
VWAP filter for filtering signals based on Volume-Weighted Average Price
Parameters:
useVwapFilter (bool) : Whether to apply the VWAP filter
Returns: Filtered result indicating whether the signal should be used
filter_pvt(length, pvtThreshold, usePvtFilter, useMovingAvg)
PVT filter for filtering signals based on Price Volume Trend
Parameters:
length (int) : The length of the PVT moving average calculation
pvtThreshold (float) : The PVT threshold
usePvtFilter (bool) : Whether to apply the PVT filter
useMovingAvg (bool) : Whether to use moving average as threshold
Returns: Filtered result indicating whether the signal should be used
filter_vo(shortLength, longLength, voThreshold, useVoFilter, useMovingAvg)
VO filter for filtering signals based on Volume Oscillator
Parameters:
shortLength (int) : The length of the short-term volume moving average
longLength (int) : The length of the long-term volume moving average
voThreshold (float) : The Volume Oscillator threshold
useVoFilter (bool) : Whether to apply the VO filter
useMovingAvg (bool) : Whether to use moving average as threshold
Returns: Filtered result indicating whether the signal should be used
filter_cho(shortLength, longLength, choThreshold, useChoFilter, useMovingAvg)
CHO filter for filtering signals based on Chaikin Oscillator
Parameters:
shortLength (int) : The length of the short-term ADI moving average
longLength (int) : The length of the long-term ADI moving average
choThreshold (float) : The Chaikin Oscillator threshold
useChoFilter (bool) : Whether to apply the CHO filter
useMovingAvg (bool) : Whether to use moving average as threshold
Returns: Filtered result indicating whether the signal should be used
filter_fi(length, fiThreshold, useFiFilter, useMovingAvg)
FI filter for filtering signals based on Force Index
Parameters:
length (int) : The length of the FI calculation
fiThreshold (float) : The Force Index threshold
useFiFilter (bool) : Whether to apply the FI filter
useMovingAvg (bool) : Whether to use moving average as threshold
Returns: Filtered result indicating whether the signal should be used
filter_garman_klass_volatility(length, useGkFilter)
Parameters:
length (int)
useGkFilter (bool)
filter_frama(src, length, useFramaFilter)
Parameters:
src (float)
length (int)
useFramaFilter (bool)
filter_bollinger_bands(src, length, stdDev, useBollingerFilter)
Parameters:
src (float)
length (int)
stdDev (float)
useBollingerFilter (bool)
filter_keltner_channel(src, length, atrMult, useKeltnerFilter)
Parameters:
src (float)
length (simple int)
atrMult (float)
useKeltnerFilter (bool)
regime_filter(src, threshold, useRegimeFilter)
Regime filter for filtering signals based on trend strength
Parameters:
src (float) : The source series
threshold (float) : The threshold for the filter
useRegimeFilter (bool) : Whether to apply the regime filter
Returns: Filtered result indicating whether the signal should be used
regime_filter_v2(src, threshold, useRegimeFilter)
Regime filter for filtering signals based on trend strength
Parameters:
src (float) : The source series
threshold (float) : The threshold for the filter
useRegimeFilter (bool) : Whether to apply the regime filter
Returns: Filtered result indicating whether the signal should be used
filter_adx(src, length, adxThreshold, useAdxFilter)
ADX filter for filtering signals based on ADX strength
Parameters:
src (float) : The source series
length (simple int) : The length of the ADX calculation
adxThreshold (int) : The ADX threshold
useAdxFilter (bool) : Whether to apply the ADX filter
Returns: Filtered result indicating whether the signal should be used
filter_volatility(minLength, maxLength, useVolatilityFilter)
Volatility filter for filtering signals based on volatility
Parameters:
minLength (simple int) : The minimum length for ATR calculation
maxLength (simple int) : The maximum length for ATR calculation
useVolatilityFilter (bool) : Whether to apply the volatility filter
Returns: Filtered result indicating whether the signal should be used
filter_ulcer(src, length, ulcerThreshold, useUlcerFilter)
Ulcer Index filter for filtering signals based on Ulcer Index
Parameters:
src (float) : The source series
length (int) : The length of the Ulcer Index calculation
ulcerThreshold (float) : The Ulcer Index threshold (default: average Ulcer Index)
useUlcerFilter (bool) : Whether to apply the Ulcer Index filter
Returns: Filtered result indicating whether the signal should be used
filter_stddev(src, length, stdDevThreshold, useStdDevFilter)
Standard Deviation filter for filtering signals based on Standard Deviation
Parameters:
src (float) : The source series
length (int) : The length of the Standard Deviation calculation
stdDevThreshold (float) : The Standard Deviation threshold (default: average Standard Deviation)
useStdDevFilter (bool) : Whether to apply the Standard Deviation filter
Returns: Filtered result indicating whether the signal should be used
filter_macdv(src, shortLength, longLength, signalSmoothing, macdVThreshold, useMacdVFilter)
MACD-V filter for filtering signals based on MACD-V
Parameters:
src (float) : The source series
shortLength (simple int) : The short length for MACD calculation
longLength (simple int) : The long length for MACD calculation
signalSmoothing (simple int) : The signal smoothing length for MACD
macdVThreshold (float) : The MACD-V threshold (default: average MACD-V)
useMacdVFilter (bool) : Whether to apply the MACD-V filter
Returns: Filtered result indicating whether the signal should be used
filter_atr(length, atrThreshold, useAtrFilter)
ATR filter for filtering signals based on Average True Range (ATR)
Parameters:
length (simple int) : The length of the ATR calculation
atrThreshold (float) : The ATR threshold (default: average ATR)
useAtrFilter (bool) : Whether to apply the ATR filter
Returns: Filtered result indicating whether the signal should be used
filter_candle_body_and_atr(length, bodyThreshold, atrThreshold, useFilter)
Candle Body and ATR filter for filtering signals
Parameters:
length (simple int) : The length of the ATR calculation
bodyThreshold (float) : The threshold for candle body size (relative to ATR)
atrThreshold (float) : The ATR threshold (default: average ATR)
useFilter (bool) : Whether to apply the candle body and ATR filter
Returns: Filtered result indicating whether the signal should be used
filter_atrp(length, atrpThreshold, useAtrpFilter)
ATRP filter for filtering signals based on ATR Percentage (ATRP)
Parameters:
length (simple int) : The length of the ATR calculation
atrpThreshold (float) : The ATRP threshold (default: average ATRP)
useAtrpFilter (bool) : Whether to apply the ATRP filter
Returns: Filtered result indicating whether the signal should be used
filter_jma(src, length, phase, useJmaFilter)
Parameters:
src (float)
length (simple int)
phase (float)
useJmaFilter (bool)
filter_cidi(src, rsiLength, shortMaLength, longMaLength, useCidiFilter)
Parameters:
src (float)
rsiLength (simple int)
shortMaLength (int)
longMaLength (int)
useCidiFilter (bool)
filter_rsi(src, length, rsiThreshold, useRsiFilter)
Parameters:
src (float)
length (simple int)
rsiThreshold (float)
useRsiFilter (bool)
filter_ichimoku_oscillator(length, threshold, useFilter)
Ichimoku Oscillator filter for filtering signals based on Ichimoku Oscillator
Parameters:
length (int) : The length of the Ichimoku Oscillator calculation
threshold (float) : The threshold for the filter (default: average Ichimoku Oscillator)
useFilter (bool) : Whether to apply the filter
Returns: Filtered result indicating whether the signal should be used
filter_cmb_composite_index(src, shortLength, longLength, threshold, useFilter)
CMB Composite Index filter for filtering signals based on CMB Composite Index
Parameters:
src (float) : The source series
shortLength (simple int) : The short length for CMB calculation
longLength (simple int) : The long length for CMB calculation
threshold (float) : The threshold for the filter (default: average CMB Composite Index)
useFilter (bool) : Whether to apply the filter
Returns: Filtered result indicating whether the signal should be used
filter_connors_rsi(src, rsiLength, rocLength, streakLength, threshold, useFilter)
Connors RSI filter for filtering signals based on Connors RSI
Parameters:
src (float) : The source series
rsiLength (simple int) : The length for RSI calculation
rocLength (int) : The length for ROC calculation
streakLength (simple int) : The length for streak calculation
threshold (float) : The threshold for the filter (default: average Connors RSI)
useFilter (bool) : Whether to apply the filter
Returns: Filtered result indicating whether the signal should be used
filter_coppock_curve(src, roc1Length, roc2Length, wmaLength, threshold, useFilter)
Coppock Curve filter for filtering signals based on Coppock Curve
Parameters:
src (float) : The source series
roc1Length (int) : The length for the first ROC calculation
roc2Length (int) : The length for the second ROC calculation
wmaLength (int) : The length for the WMA calculation
threshold (float) : The threshold for the filter (default: average Coppock Curve)
useFilter (bool) : Whether to apply the filter
Returns: Filtered result indicating whether the signal should be used
filter_pmo(src, pmoLength, smoothingLength, threshold, useFilter)
DecisionPoint Price Momentum Oscillator filter for filtering signals based on PMO
Parameters:
src (float) : The source series
pmoLength (simple int) : The length for PMO calculation
smoothingLength (simple int) : The smoothing length for PMO
threshold (float) : The threshold for the filter (default: average PMO Oscillator)
useFilter (bool) : Whether to apply the filter
Returns: Filtered result indicating whether the signal should be used
filter_macd(src, shortLength, longLength, signalSmoothing, threshold, useFilter)
MACD filter for filtering signals based on MACD
Parameters:
src (float) : The source series
shortLength (simple int) : The short length for MACD calculation
longLength (simple int) : The long length for MACD calculation
signalSmoothing (simple int) : The signal smoothing length for MACD
threshold (float) : The threshold for the filter (default: average MACD)
useFilter (bool) : Whether to apply the filter
Returns: Filtered result indicating whether the signal should be used
filter_macd_histogram(src, shortLength, longLength, signalSmoothing, threshold, useFilter)
MACD-Histogram filter for filtering signals based on MACD-Histogram
Parameters:
src (float) : The source series
shortLength (simple int) : The short length for MACD calculation
longLength (simple int) : The long length for MACD calculation
signalSmoothing (simple int) : The signal smoothing length for MACD
threshold (float) : The threshold for the filter (default: average MACD-Histogram)
useFilter (bool) : Whether to apply the filter
Returns: Filtered result indicating whether the signal should be used
filter_kst(src, r1, r2, r3, r4, sm1, sm2, sm3, sm4, signalLength, threshold, useFilter)
Pring's Know Sure Thing filter for filtering signals based on KST
Parameters:
src (float) : The source series
r1 (int) : The first ROC length
r2 (int) : The second ROC length
r3 (int) : The third ROC length
r4 (int) : The fourth ROC length
sm1 (int) : The first smoothing length
sm2 (int) : The second smoothing length
sm3 (int) : The third smoothing length
sm4 (int) : The fourth smoothing length
signalLength (int) : The signal line smoothing length
threshold (float) : The threshold for the filter (default: average KST Oscillator)
useFilter (bool) : Whether to apply the filter
Returns: Filtered result indicating whether the signal should be used
filter_special_k(src, r1, r2, r3, r4, sm1, sm2, sm3, sm4, threshold, useFilter)
Pring's Special K filter for filtering signals based on Special K
Parameters:
src (float) : The source series
r1 (int) : The first ROC length
r2 (int) : The second ROC length
r3 (int) : The third ROC length
r4 (int) : The fourth ROC length
sm1 (int) : The first smoothing length
sm2 (int) : The second smoothing length
sm3 (int) : The third smoothing length
sm4 (int) : The fourth smoothing length
threshold (float) : The threshold for the filter (default: average Special K)
useFilter (bool) : Whether to apply the filter
Returns: Filtered result indicating whether the signal should be used
filter_roc_momentum(src, rocLength, momentumLength, threshold, useFilter)
ROC and Momentum filter for filtering signals based on ROC and Momentum
Parameters:
src (float) : The source series
rocLength (int) : The length for ROC calculation
momentumLength (int) : The length for Momentum calculation
threshold (float) : The threshold for the filter (default: average ROC and Momentum)
useFilter (bool) : Whether to apply the filter
Returns: Filtered result indicating whether the signal should be used
filter_rrg_relative_strength(src, length, threshold, useFilter)
RRG Relative Strength filter for filtering signals based on RRG Relative Strength
Parameters:
src (float) : The source series
length (int) : The length for RRG Relative Strength calculation
threshold (float) : The threshold for the filter (default: average RRG Relative Strength)
useFilter (bool) : Whether to apply the filter
Returns: Filtered result indicating whether the signal should be used
filter_alligator(useFilter)
Parameters:
useFilter (bool)
filter_wyckoff(useFilter)
Parameters:
useFilter (bool)
filter_squeeze_momentum(bbLength, bbStdDev, kcLength, kcMult, useFilter)
Parameters:
bbLength (int)
bbStdDev (float)
kcLength (simple int)
kcMult (float)
useFilter (bool)
filter_atr_compression(length, atrThreshold, useFilter)
Parameters:
length (simple int)
atrThreshold (float)
useFilter (bool)
filter_low_volume(length, useFilter)
Parameters:
length (int)
useFilter (bool)
filter_nvi_accumulation(useFilter)
Parameters:
useFilter (bool)
filter_ma_slope(src, length, slopeThreshold, useFilter)
Parameters:
src (float)
length (int)
slopeThreshold (float)
useFilter (bool)
filter_adx_low(len, lensig, adxThreshold, useFilter)
Parameters:
len (simple int)
lensig (simple int)
adxThreshold (int)
useFilter (bool)
filter_choppiness_index(length, chopThreshold, useFilter)
Parameters:
length (int)
chopThreshold (float)
useFilter (bool)
filter_range_detection(length, useFilter)
Parameters:
length (int)
useFilter (bool)
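For context, here is a rough sketch of how one of the exported filters above might be wired into a script. The import path, the argument values, and the assumption that the "filtered result" return is a boolean flag are all placeholders (the library's actual publisher, name, and defaults are not shown on this page); the sketch only illustrates the calling convention implied by the parameter lists.
//@version=5
indicator("Filter library usage (sketch)", overlay=true)
// hypothetical import path - replace with the library's real publisher/name/version
import publisher/SignalFilters/1 as flt
// a raw example signal: fast EMA crossing above slow EMA
rawLong = ta.crossover(ta.ema(close, 9), ta.ema(close, 21))
// gate the raw signal through the KST filter; lengths and threshold here are illustrative only
useKst = input.bool(true, "Use KST filter")
kstOk = flt.filter_kst(close, 10, 15, 20, 30, 10, 10, 10, 15, 9, 0.0, useKst)
plotshape(rawLong and kstOk, style=shape.triangleup, location=location.belowbar, color=color.green, size=size.small)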
Vesica Piscis Visualization-Secret Geometry-AYNET
Explanation
Customization Options:
circle_radius: Adjust the size of the circles.
line_color: Choose the color of the circles.
line_width: Adjust the thickness of the circle lines.
segments: Increase or decrease the smoothness of the circles (higher values make smoother circles but use more computational resources).
Placement:
The first circle is centered at circle1_x and the second is offset horizontally; for a true Vesica Piscis the offset should equal circle_radius, so that each circle's center lies on the other's circumference.
Intersection Highlight:
The intersection area is visually emphasized with a semi-transparent background (bgcolor), which can be customized or removed if unnecessary.
Smoothness:
The segments input determines how many points are used to create each circle. Higher values create smoother curves.
Adjustments
Ensure the circles fit within the visible chart area by adjusting circle1_x and circle_radius.
If needed, you can add additional features, such as drawing lines to connect the centers or labeling the Vesica Piscis region.
Let me know if you want further refinements or additional features!
DRKMetrics
Library "DRKMetrics"
TODO: add library description here
curve(disp_ind)
Call function to get a certain curve of your strategy.
Parameters:
disp_ind (string)
Returns: Returns type of curve plot.
cleaner(disp_ind, plot)
Call function to filter out your Strategy plots
Parameters:
disp_ind (string)
plot (float)
cobraTable(option, position)
Assign this function to a random variable to get the "Performance Table"
Parameters:
option (simple string)
position (simple string)
Enhanced Economic Composite with Dynamic Weight
Overview of the Indicator:
The "Enhanced Economic Composite with Dynamic Weight" is a comprehensive tool that combines multiple economic indicators, technical signals, and dynamic weighting to provide insights into market and economic health. It adjusts based on current volatility and recession risk, offering a detailed view of market conditions.
What This Indicator Does :
Tracks Economic Health: Uses key economic and market indicators to assess overall market conditions.
Dynamic Weighting: Adjusts the importance of components like stock indices, gold, and bonds based on volatility (VIX) and yield curve inversion.
Technical Signals: Identifies market momentum shifts through key crossovers like the Golden Cross, Death Cross, Silver Cross, and Hospice Cross.
Recession Shading: Marks known recessions for historical context.
Economic Factors Considered :
TIP (Treasury Inflation-Protected Securities): Reflects inflation expectations.
Gold: A safe-haven asset, increases in weight during volatility or rising momentum.
US Dollar Index (DXY): Measures USD strength, fixed weight of 10%, smoothed with EMA.
Commodities (DBC): Indicates global demand; weight increases with momentum or volatility.
Volatility Index (VIX): Reflects market risk, inversely related to market confidence.
Stock Indices (S&P 500, DJIA, NASDAQ, Russell 2000): Represent market performance, with weights reduced during high volatility or negative yield spread.
Yield Spread (10Y - 2Y Treasuries): Predicts recessions; negative spread reduces stock weighting.
Credit Spread (HYG - TLT): Indicates market risk through corporate vs. government bond yields.
How and Why Factors are Weighted:
Stock Indices get more weight in stable markets (low VIX, positive yield spread), while safe-haven assets like gold and bonds gain weight in volatile markets or during yield curve inversions. This dynamic adjustment ensures the composite reflects current market sentiment.
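As a rough illustration of this kind of regime-dependent weighting (not the script's actual weights, thresholds, or symbols, which are assumptions here), a sketch might shift weight away from equities when the VIX is elevated or the 10Y-2Y spread turns negative:
//@version=5
indicator("Dynamic weight sketch")
vix    = request.security("TVC:VIX", timeframe.period, close)
us10y  = request.security("TVC:US10Y", timeframe.period, close)
us02y  = request.security("TVC:US02Y", timeframe.period, close)
spread = us10y - us02y                       // 10Y - 2Y yield spread
riskOff = vix > 25 or spread < 0             // thresholds are illustrative, not the script's
stockWeight = riskOff ? 0.30 : 0.60          // equities lose weight in risk-off regimes
goldWeight  = riskOff ? 0.30 : 0.10          // safe havens gain weight
plot(stockWeight, "Stock weight", color.blue)
plot(goldWeight, "Gold weight", color.orange)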
Technical Signals:
Golden Cross: 50 EMA crossing above 200 SMA, signaling bullish momentum.
Death Cross: 50 EMA below 200 SMA, indicating bearish momentum.
Silver Cross: 21 EMA crossing above 50 EMA, plotted only if below the 200-day SMA, signaling potential upside in downtrend conditions.
Hospice Cross: 50 EMA crosses below 21 EMA, plotted only if 21 EMA is below 200 SMA, a leading bearish signal.
Recession Shading:
Recession periods like the Great Recession, Early 2000s Recession, and COVID-19 Recession are shaded to provide historical context.
Benefits of Using This Indicator:
Comprehensive Analysis: Combines economic fundamentals and technical analysis for a full market view.
Dynamic Risk Adjustment: Weights shift between growth and safe-haven assets based on volatility and recession risk.
Early Signals: The Silver Cross and Hospice Cross provide early warnings of potential market shifts.
Recession Forecasting: Helps predict downturns through the yield curve and recession indicators.
Who Can Benefit:
Traders: Identify market momentum shifts early through crossovers.
Long-term Investors: Use recession warnings and dynamic adjustments to protect portfolios.
Analysts: A holistic tool for analyzing both economic trends and market movements.
This indicator helps users navigate varying market conditions by dynamically adjusting based on economic factors and providing early technical signals for market momentum shifts.
Machine Learning Signal Filter
Introducing the "Machine Learning Signal Filter," an innovative trading indicator designed to leverage the power of machine learning to enhance trading strategies. This tool combines advanced data processing capabilities with user-friendly customization options, offering traders a sophisticated yet accessible means to optimize their market analysis and decision-making processes. Importantly, this indicator does not repaint, ensuring that signals remain consistent and reliable after they are generated.
Machine Learning Integration
The "Machine Learning Signal Filter" employs machine learning algorithms to analyze historical price data and identify patterns that may not be immediately apparent through traditional technical analysis. By utilizing techniques such as regression analysis and neural networks, the indicator continuously learns from new data, refining its predictive capabilities over time. This dynamic adaptability allows the indicator to adjust to changing market conditions, potentially improving the accuracy of trading signals.
Key Features and Benefits
Dynamic Signal Generation: The indicator uses machine learning to generate buy and sell signals based on complex data patterns. This approach enables it to adapt to evolving market trends, offering traders timely and relevant insights. Crucially, the indicator does not repaint, providing reliable signals that traders can trust.
Customizable Parameters: Users can fine-tune the indicator to suit their specific trading styles by adjusting settings such as the temporal synchronization and neural pulse rate. This flexibility ensures that the indicator can be tailored to different market environments.
Visual Clarity and Usability: The indicator provides clear visual cues on the chart, including color-coded signals and optional display of signal curves. Users can also customize the table's position and text size, enhancing readability and ease of use.
Comprehensive Performance Metrics: The indicator includes a detailed metrics table that displays key performance indicators such as return rates, trade counts, and win/loss ratios. This feature helps traders assess the effectiveness of their strategies and make data-driven decisions.
How It Works
The core of the "Machine Learning Signal Filter" is its ability to process and learn from large datasets. By applying machine learning models, the indicator identifies potential trading opportunities based on historical data patterns. It uses regression techniques to predict future price movements and neural networks to enhance pattern recognition. As new data is introduced, the indicator refines its algorithms, improving its accuracy and reliability over time.
Use Cases
Trend Following: Ideal for traders seeking to capitalize on market trends, the indicator helps identify the direction and strength of price movements.
Scalping: With its ability to provide quick signals, the indicator is suitable for scalpers aiming for rapid profits in volatile markets.
Risk Management: By offering insights into trade performance, the indicator aids in managing risk and optimizing trade setups.
In summary, the "Machine Learning Signal Filter" is a powerful tool that combines the analytical strength of machine learning with the practical needs of traders. Its ability to adapt and provide actionable insights makes it an invaluable asset for navigating the complexities of financial markets.
The "Machine Learning Signal Filter" is a tool designed to assist traders by providing insights based on historical data and machine learning techniques. It does not guarantee profitable trades and should be used as part of a comprehensive trading strategy. Users are encouraged to conduct their own research and consider their financial situation before making trading decisions. Trading involves significant risk, and it is possible to lose more than the initial investment. Always trade responsibly and be aware of the risks involved.
25-Day Momentum Index
Description:
The 25-Day Momentum Index (25D MI) is a technical indicator designed to measure the strength and direction of price movements over a 25-day period. Inspired by classic momentum analysis, this indicator helps traders identify trends and potential reversal points in the market.
How It Works:
Momentum Calculation: The 25D MI calculates momentum as the difference between the current closing price and the closing price 25 days ago. This difference provides insights into the market's recent strength or weakness.
Plotting: The indicator plots the Momentum Index as a blue line, showing the raw momentum values. A zero line is also plotted in gray to serve as a reference point for positive and negative momentum.
Highlighting Zones:
Positive Momentum: When the Momentum Index is above zero, it is plotted in green, highlighting positive momentum phases.
Negative Momentum: When the Momentum Index is below zero, it is plotted in red, highlighting negative momentum phases.
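The full calculation and coloring described above reduce to a few lines of Pine; a minimal sketch (the length input and exact colors are the only assumptions):
//@version=5
indicator("25-Day Momentum Index (sketch)")
len = input.int(25, "Momentum length")
mi = close - close[len]                          // current close minus the close `len` days ago
plot(mi, "Momentum Index", color.blue)
plot(mi, "Momentum zones", mi >= 0 ? color.green : color.red, 2)
hline(0, "Zero line", color.gray)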
Usage:
A rising curve above the zero line means increasing upward momentum, while a rising curve below the zero line means decreasing downward momentum. By the same token, a falling curve below the zero line means increasing downward momentum, and a falling curve above the zero line means decreasing upward momentum.
This indicator is ideal for traders looking to complement their strategy with a visual tool that captures the essence of market momentum over a significant period. Use it to enhance your technical analysis and refine your trading decisions.
Relative Volume
Hello traders,
"There's nothing new on Wall Street" is an age-old saying that still shows its relevance in modern day financial markets; volume still serves as a valuable tool for any trader just as it did for those that came and succeeded before us; in order to succeed in modern day markets one has to take it up a notch and dabble in complicated topics, like math. Now I dunno about you reader but I’m not keen on sitting around all day just to watch numbers on a screen; it’s pretty important to add some color into your life before it becomes dull but how can someone add colors into their trading toolkit as an aid rather than bother? With a bit of help from 3 other amazing open-source indicators you too can become a statistics enjoyer by combining math and colors to make pattern recognition much more intuitive and offering more peace of mind when trading. “Sir but how?”, glad you didn’t ask, it helps with simplifying statistics, in this case a Gaussian bellcurve
“HUH?”, you say? Alright class, Gaussian bellcurves for math dislikers 101 is in session
- Imagine that we have a bunch of numbers that we want to graph. We could just draw a line and plot the numbers on it, but that might not be very interesting.
- Instead, we can use the shape of a bell to show how many of each number we have.
- Let's say we have a lot of people and we want to graph how tall they are. We would start by making a line from the shortest person to the tallest person, and then we would draw the bell shape around the line.
- The bell shape is called a "Gaussian Bell Curve," and it shows us how many people are a certain height.
- In the middle of the bell, where it's the widest, we have the most people who are about average height. As we move to the sides of the bell, the curve gets lower because there are fewer people who are really tall or really short.
The bell curve discussed above is the main idea behind the candle-coloring component of this indicator: analyzing the distribution of an entire dataset, in this case volume, lets color alert us when volume/participation in the market is far from its average, and therefore an opportunity could be present. Fair warning: it's important not to focus strictly on volume, as volume is meant to be confluence with the current structure of the market rather than a cause of tunnel vision.
Why 3 indicators to combine?
It starts with the RVOL indicator by Mik3Christ3ns3n as the backbone: it calculates the average volume over a specified period of time, then compares each new volume value to this average to determine whether it is above or below it. The indicator then normalizes the volume data and calculates the z-score/standard deviation to determine whether the volume is within the normal range or is an anomaly beyond a specified threshold, which can also be set as an alert to aid in eyeing possible opportunities.
The code also includes Candle Coloring by Morty, as it uses a function to get the z-score for the size of the candle's body and then compares it to the z-score for volume to determine whether the body size is a factor in the price action.
Finally, the code plots the anomalies and the normalized volume data on the chart using the first RVOL indicator mentioned, and colors the bars of the chart based on whether they are within the normal range or are anomalies, using code from veryfid's relative volume indicator.
Overall, this custom technical indicator is best used to identify unusual changes in trading volume, which may indicate potential price movements in the underlying.
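For readers who want to see the mechanics without opening the three source scripts, here is a minimal sketch of the volume z-score and anomaly coloring idea described above. The lookback length, threshold, and colors are assumptions, not the published defaults:
//@version=5
indicator("Relative volume z-score (sketch)", overlay=true)
len       = input.int(60, "Lookback")
threshold = input.float(2.0, "Anomaly threshold (standard deviations)")
// z-scores for volume and candle body size over the lookback window
volZ  = (volume - ta.sma(volume, len)) / ta.stdev(volume, len)
body  = math.abs(close - open)
bodyZ = (body - ta.sma(body, len)) / ta.stdev(body, len)   // the full script compares this against the volume z-score
// an anomaly: participation well beyond its recent average
anomaly = volZ > threshold
barcolor(anomaly ? (close > open ? color.lime : color.red) : na)
plot(bodyZ, "Body z-score", display = display.data_window)
alertcondition(anomaly, "Volume anomaly", "Relative volume beyond threshold")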
How about some examples?
This first example is for my scalpers wanting to get in and out but not having much of an idea where, let alone how; using a tool like VWAP can be great for determining the area of value in which to execute mean-reversion trades once a speculator spots a colored-candle anomaly at a standard deviation band. It works best when VWAP is flat, as that signals a lack of conviction from both bulls and bears.
This second example is for my fire-and-forget intraweek swing traders who want to execute a higher-timeframe trend-following bias. A speculator starting off 2023 notices that the negative sentiment around Binance from late last year has quieted down and has conviction in upside after BTC began an uptrend, as monthly VWAP (right chart) has begun sloping up along with a rally whose momentum is shown by the blue colored candle, so the trader waits for a pullback for entry. On the 4H chart to the left, the speculator notices a pullback into the area of interest to do business, so a limit bid is left to enter for continued upside in Bitcoin through January 2023, just by keeping things simple.
That’s really the main purpose of this indicator: simplicity of statistics for confluence using volume
Volume precedes price, and price moves only for narrative to follow. Why wait for your subjective Twitter timeline to give you a biased narrative to trade when you can use objective analysis, combining statistics and colors, to allow for a cleaner execution process?
“But what about risk management?” Glad you didn’t ask reader!
One last example then: we meet our trend-following trader again, feeling euphoric, so they know profit-taking season is coming soon but want to leave emotion out of it. How to go about it? Same idea as our last trend-following example: the 4h chart on the right shows Bitcoin lose and trade back within the 2nd standard deviation of quarterly VWAP, telling our speculator that the uptrend has broken. On top of that, the 30-minute chart on the left shows aggressive market buyers being steadily absorbed by limit sellers on multiple retests of 30,500, shown with the green colored candles and volume bars below. Time to sell.
Turns out that selling was proactive risk management because price dumped thereafter
Hope this explanation gave you some useful insights on using statistics as colors, even from cherry-picked examples. Remember that my examples being cherry-picked doesn't invalidate these concepts at all, as the market only does two things: initiate aggressive auctions and respond passively to auctions. This tool makes it much simpler to see where that initiative, aggressive activity is happening, and to deduce whether others will respond to such an anomaly or whether the aggression will continue.
If there's just one thing you take from this: simplicity above all. Cheers and good luck
Fake Strategy
THIS IS A FAKE STRATEGY. PLEASE DO NOT USE THIS FOR TRADING.
Just publishing this to demonstrate how easily you can fake backtest results in strategies. However, there are ways to identify the scams. Let's discuss the major red flags in a strategy, how to identify them, and how to stay away from them.
Any strategy that proclaims a significantly high win rate (such as this one) is not practical; such results can only be achieved via the following means.
Significantly high risk compared to reward
Trades are set up in such a way that profits are taken on a small movement whereas stops are significantly farther away. By doing this, the win rate will surely increase, but you will be picking up pennies while risking plenty of capital. The general trait of such strategies can be identified by comparing the average trade and the max drawdown. These kinds of strategies will have a significantly higher drawdown even though the number of losses is small. For example, 1 losing trade leads to a drawdown of 10+% whereas every winning trade contributes only 0.25%.
We can also see this kind of behaviour in option-selling strategies such as 0 and 1 DTE option selling. Here too the probability of winning can be pretty high (north of 90%), but on every win you make 1-2% of your capital, whereas on the remaining trades you can lose your complete capital, which leads to an overall losing position.
Inducing repainting through code
This strategy is an excellent example of how repainting can be induced via code using the request.security() function. There are plenty of ways a strategy or code can be made to repaint. The TradingView user manual has lots of information about repainting; feel free to read through it if you have extra time. If you look at this code, you will see how simple it is to induce repainting in a strategy to make it look like an infinite money-printing machine.
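To make the point concrete, here is a sketch of the classic trick (not this script's exact code): requesting higher-timeframe data with lookahead enabled and no bar offset feeds future information into historical bars, so the backtest "knows" the day's close before the day is finished, while the offset version below it is the standard non-repainting form.
//@version=5
indicator("Repainting request.security (sketch)", overlay=true)
// BAD: on historical bars this returns the daily close before the day has actually closed
leaky = request.security(syminfo.tickerid, "D", close, lookahead = barmerge.lookahead_on)
// SAFER: offset the series by one bar so only confirmed daily closes are used
safe  = request.security(syminfo.tickerid, "D", close[1], lookahead = barmerge.lookahead_on)
plot(leaky, "Repainting daily close", color.red)
plot(safe, "Non-repainting daily close", color.green)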
High Leverage and lack of usage of margin
Using leverage in Pine can show false results. This is because the strategy engine will not stop when equity goes below 0% until the trade is closed, but that does not happen in real life. This is why using leverage along with high-risk, low-reward trades can show false results overall, making it look like the strategy is unbeatable. When you try to use it in real time, it is likely that the account will be blown out.
To understand leverage conditions, please have a look at the strategy property fields - Order Size, Pyramiding, Commission, Slippage, Margin Long/Short.
Curve fitting
If the author claims that the strategy will only work on a particular set of instruments and a particular timeframe, then the strategy is not real; it is curve fitting. Knowingly or unknowingly, the author has moulded the strategy to fit what has happened in the past. This is a general issue that even non-malicious authors run into. It is essential to test the strategy across various sets of instruments and timeframes to understand its real capability. Use back-testing as test cases: the more test cases you have, the more bug-free your strategy will be. There are many methods for understanding curve fitting and for testing a strategy more rigorously, which authors can study and implement.
Significantly short trades - a sign of lack of strategy
A strategy built using Pine generally works on the close of a candle, so all calculations usually happen when the candle closes. You can force intra-bar calculations using the bar magnifier, but that is not equivalent to tick data. For this reason, I consider any trade happening within a single bar (meaning it opens and closes within the same bar) as unreliable: it is not possible for the strategy back-tester to know, in a completely foolproof way, whether the entry condition was satisfied first or the exit. The bar magnifier can help reduce this issue but will not eradicate it completely. If a strategy has lots of trades closing within the same bar, it is very likely that the backtest results are not reliable.
Hope this helps at least some people understand the scams and stay away from them.
Multi TF Trend Indicator
...Mark Douglas in his book Trading in the Zone wrote
The longer the time frame, the more significant the trend, so a trending market on a daily bar chart is more significant than a trending market on a 30-minute bar chart. Therefore, the trend on the daily bar chart would take precedence over the trend on the 30-minute bar chart and would be considered the major trend. To determine the direction of the major trend, look at what is happening on a daily bar chart. If the trend is up on the daily, you are only going to look for a sell-off or retracement down to what your edge defines as support on the 30-minute chart. That's where you will become a buyer. On the other hand, if the trend is down on the daily, you are only going to look for a rally up to what your edge defines as a resistance level to be a seller on the 30-minute chart. Your objective is to determine, in a downtrending market, how far it can rally on an intraday basis and still not violate the symmetry of the longer trend. In an up-trending market, your objective is to determine how far it can sell off on an intraday basis without violating the symmetry of the longer trend. There's usually very little risk associated with these intraday support and resistance points, because you don't have to let the market go very far beyond them to tell you the trade isn't working.
The purpose of this indicator is to show both the major and minor trend on the same chart, with no need to switch between timeframes
Script includes
timeframe to determine the major trend
price curve, with the close price as the default, but you can pick the MA you want
type of coloring, either curve color or the background color
Implementation details
major trend is determined by the slope of the price curve
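As a rough sketch of that idea (not the published script itself; the smoothing length, colors, and the non-repainting request.security pattern are my assumptions), the major trend can be read from the slope of a higher-timeframe curve while the chart timeframe supplies the minor trend:
//@version=5
indicator("Multi TF trend (sketch)", overlay=true)
htf   = input.timeframe("D", "Major trend timeframe")
maLen = input.int(50, "Price curve length")
curve = ta.sma(close, maLen)
// confirmed higher-timeframe values of the same curve (offset by one bar to avoid repainting)
htfCurve     = request.security(syminfo.tickerid, htf, ta.sma(close, maLen)[1], lookahead = barmerge.lookahead_on)
htfCurvePrev = request.security(syminfo.tickerid, htf, ta.sma(close, maLen)[2], lookahead = barmerge.lookahead_on)
majorUp = htfCurve > htfCurvePrev            // slope of the major-trend curve
minorUp = curve > curve[1]                   // slope of the chart-timeframe curve
plot(curve, "Price curve", minorUp ? color.green : color.red)
bgcolor(majorUp ? color.new(color.green, 88) : color.new(color.red, 88))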
Further improvements
a variation of techniques for determining the major trend (crossing MA, pivot points etc.)
major trend change alerts
Thanks @loxx for pullData helper function
Many Moving Averages
A smooth-looking indicator created from a mix of ALMA and LRC curves. It includes alternative calculations for both, which I arrived at through trial and error, so a variety of combinations work to varying degrees. Just something I was playing around with that turned out looking pretty nice.
One-Sided Gaussian Filter w/ Channels [Loxx]
One-Sided Gaussian Filter w/ Channels is a Gaussian Moving Average calculated using a Fibonacci weighting function. Keltner channels have been added to show zones of exhaustion. A better name would be "Half Gaussian bell weighted" or "Half normal distribution weighted" indicator, since the weights for calculating the average (similar to a linear weighted average) are taken from a normal-distribution-curve-like function, but only half of the curve is used in the calculation.
Information on the Gaussian distribution can be found here: en.wikipedia.org. Once you take a look at the standard normal distribution curve, it will be much clearer what exactly this indicator does.
After the Gaussian Filter is applied to the source input, an Ehlers' 2-Pole Super Smoother is applied to reduce noise without significant lag.
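The weighting idea is easier to see in code than in prose. Below is a generic sketch of a half-bell weighted average: the newest bar sits at the peak of the curve and older bars receive Gaussian-decaying weights. This is the general concept only, not Loxx's exact implementation (the width parameter is an assumption, and the real script additionally applies the Fibonacci weighting and the Super Smoother mentioned above):
//@version=5
indicator("Half-Gaussian weighted MA (sketch)", overlay=true, max_bars_back=500)
len   = input.int(30, "Length")
width = input.float(0.4, "Bell width as a fraction of length")
halfGaussMa(src, length) =>
    num = 0.0
    den = 0.0
    for i = 0 to length - 1
        // only one side of the bell: the newest bar (i = 0) gets the largest weight
        w = math.exp(-0.5 * math.pow(i / (width * length), 2))
        num += src[i] * w
        den += w
    num / den
plot(halfGaussMa(close, len), "Half-Gaussian MA", color.orange, 2)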
Included:
Bar coloring
Signals
Alerts
Loxx's Expanded Source Types
Bitcoin Power Law Bands (BTC Power Law) Indicator
█ OVERVIEW
The 'Bitcoin Power Law Bands' indicator is a set of three US dollar price trendlines and two price bands for bitcoin , indicating overall long-term trend, support and resistance levels as well as oversold and overbought conditions. The magnitude and growth of the middle (Center) line is determined by double logarithmic (log-log) regression on the entire USD price history of bitcoin . The upper (Resistance) and lower (Support) lines follow the same trajectory but multiplied by respective (fixed) factors. These two lines indicate levels where the price of bitcoin is expected to meet strong long-term resistance or receive strong long-term support. The two bands between the three lines are price levels where bitcoin may be considered overbought or oversold.
All parameters and visuals may be customized by the user as needed.
█ CONCEPTS
Long-term models
Long-term price models have many challenges, the most significant of which is getting the growth curve right overall. No one can predict how a certain market, asset class, or financial instrument will unfold over several decades. In the case of bitcoin , price history is very limited and extremely volatile, and this further complicates the situation. Fortunately for us, a few smart people already had some bright ideas that seem to have stood the test of time.
Power law
The so-called power law is the only long-term bitcoin price model that has a chance of survival for the years ahead. The idea behind the power law is very simple: over time, the rapid (exponential) initial growth cannot possibly be sustained (see The seduction of the exponential curve for a fun take on this). Year-on-year returns, therefore, must decrease over time, which leads us to the concept of diminishing returns and the power law. In this context, the power law translates to linear growth on a chart with both its axes scaled logarithmically. This is called the log-log chart (as opposed to the semilog chart you see above, on which only one of the axes - price - is logarithmic).
Log-log regression
When both price and time are scaled logarithmically, the power law leads to a linear relationship between them. This in turn allows us to apply linear regression techniques, which will find the best-fitting straight line to the data points in question. The result of performing this log-log regression (i.e. linear regression on a log-log scaled dataset) is two parameters: slope (m) and intercept (b). These parameters fully describe the relationship between price and time as follows: log(P) = m * log(T) + b, where P is price and T is time. Price is measured in US dollars , and Time is counted as the number of days elapsed since bitcoin 's genesis block.
DPC model
The final piece of our puzzle is the Dynamic Power Cycle (DPC) price model of bitcoin . DPC is a long-term cyclic model that uses the power law as its foundation, to which a periodic component stemming from the block subsidy halving cycle is applied dynamically. The regression parameters of this model are re-calculated daily to ensure longevity. For the 'Bitcoin Power Law Bands' indicator, the slope and intercept parameters were calculated on publication date (March 6, 2022). The slope of the Resistance Line is the same as that of the Center Line; its intercept was determined by fitting the line onto the Nov 2021 cycle peak. The slope of the Support Line is the same as that of the Center Line; its intercept was determined by fitting the line onto the Dec 2018 trough of the previous cycle. Please see the Limitations section below on the implications of a static model.
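To make the relationship concrete, the center line can be reconstructed directly from the regression parameters: since log(P) = m * log(T) + b, the price trendline is P = 10^b * T^m (using base-10 logs here), and the resistance and support lines follow the same trajectory multiplied by fixed factors, as described in the overview. A small sketch, where the slope, intercept, and band factors are placeholders rather than the indicator's published calibration:
//@version=5
indicator("Power law bands (sketch)", overlay=true)
m = input.float(5.8, "Slope (m)")                  // placeholder, not the published value
b = input.float(-17.0, "Intercept (b)")            // placeholder, base-10 intercept
resFactor = input.float(3.0, "Resistance multiple")
supFactor = input.float(0.4, "Support multiple")
genesis = timestamp(2009, 1, 3, 0, 0)
days = (time - genesis) / 86400000.0               // days since the genesis block
center = math.pow(10, b) * math.pow(days, m)       // log(P) = m*log(T) + b  =>  P = 10^b * T^m
plot(center, "Center", color.gray)
plot(center * resFactor, "Resistance", color.red)
plot(center * supFactor, "Support", color.green)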
█ FEATURES
Inputs
• Parameters
• Center Intercept (b) and Slope (m): These log-log regression parameters control the behavior of the grey line in the middle
• Resistance Intercept (b) and Slope (m): These log-log regression parameters control the behavior of the red line at the top
• Support Intercept (b) and Slope (m): These log-log regression parameters control the behavior of the green line at the bottom
• Controls
• Plot Line Fill: N/A
• Plot Opportunity Label: Controls the display of current price level relative to the Center, Resistance and Support Lines
Style
• Visuals
• Center: Control, color, opacity, thickness, price line control and line style of the Center Line
• Resistance: Control, color, opacity, thickness, price line control and line style of the Resistance Line
• Support: Control, color, opacity, thickness, price line control and line style of the Support Line
• Plots Background: Control, color and opacity of the Upper Band
• Plots Background: Control, color and opacity of the Lower Band
• Labels: N/A
• Output
• Labels on price scale: Controls the display of current Center, Resistance and Support Line values on the price scale
• Values in status line: Controls the display of current Center, Resistance and Support Line values in the indicator's status line
█ HOW TO USE
The indicator includes three price lines:
• The grey Center Line in the middle shows the overall long-term bitcoin USD price trend
• The red Resistance Line at the top is an indication of where the bitcoin USD price is expected to meet strong long-term resistance
• The green Support Line at the bottom is an indication of where the bitcoin USD price is expected to receive strong long-term support
These lines envelope two price bands:
• The red Upper Band between the Center and Resistance Lines is an area where bitcoin is considered overbought (i.e. too expensive)
• The green Lower Band between the Support and Center Lines is an area where bitcoin is considered oversold (i.e. too cheap)
The power law model assumes that the price of bitcoin will fluctuate around the Center Line, by meeting resistance at the Resistance Line and finding support at the Support Line. When the current price is well below the Center Line (i.e. well into the green Lower Band), bitcoin is considered too cheap (oversold). When the current price is well above the Center Line (i.e. well into the red Upper Band), bitcoin is considered too expensive (overbought). This idea alone is not sufficient for profitable trading, but, when combined with other factors, it could guide the user's decision-making process in the right direction.
█ LIMITATIONS
The indicator is based on a static model, and for this reason it will gradually lose its usefulness. The Center Line is the most durable of the three lines since the long-term growth trend of bitcoin seems to deviate little from the power law. However, how far price extends above and below this line will change with every halving cycle (as can be seen for past cycles). Periodic updates will be needed to keep the indicator relevant. The user is invited to adjust the slope and intercept parameters manually between two updates of the indicator.
█ RAMBLINGS
The 'Bitcoin Power Law Bands' indicator is a useful tool for users wishing to place bitcoin in a macro context. As described above, the price level relative to the three lines is a rough indication of whether bitcoin is over- or undervalued. Users wishing to gain more insight into bitcoin price trends may follow the author's periodic updates of the DPC model (contact information below).
█ NOTES
The author regularly posts on Twitter using the @DeFi_initiate handle.
█ THANKS
Many thanks to the following individuals, who - one way or another - made the 'Bitcoin Power Law Bands' indicator possible:
• TradingView user 'capriole_charles', whose open-source 'Bitcoin Power Law Corridor' script was the basis for this indicator
• Harold Christopher Burger, whose Bitcoin’s natural long-term power-law corridor of growth article (2019) was the basis for the 'Bitcoin Power Law Corridor' script
• Bitcoin Forum user "Trololo", who posted the original power law model at Logarithmic (non-linear) regression - Bitcoin estimated value (2014)
Grid Bot Auto
This script is an auto-adjusting grid bot simulator. It is an improved version of the original Grid Bot Simulator. The grid bot is best used for ranging/choppy markets. Prices are divided into grids, or trade zones, that will trigger signals each time a new zone is entered. During ranging markets, each transaction is followed by a "take profit." As the market starts to trend, transactions are stacked (comparable to DCA) until the market consolidates. No signals are triggered above the Upper Limit or below the Lower Limit. Unlike the previous version, the upper and lower limits are calculated automatically. Grid levels are determined by four factors: Smoothing, Laziness, Elasticity, and Grid Intervals.
Smoothing:
A moving average (or linear regression) is applied to each close price as a basis. Options for smoothing are Linear Regression, Simple Moving Average, Exponential Moving Average, Volume-Weighted Moving Average, Triple-Exponential Moving Average.
Laziness:
Laziness is the percentage change required to reach the next level. If laziness is 1.5, the price must move up or down by 1.5% before the grid will change. This concept is based on Alex Grover’s Efficient Trend Step. This allows the grids to be based on even price levels, as opposed to jagged moving averages.
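In code, the "laziness" step is just a small piece of state that refuses to move until the smoothed price has drifted a full step away; a rough sketch of that mechanic (the smoothing choice and default percentage are illustrative, not the script's defaults):
//@version=5
indicator("Lazy grid centerline (sketch)", overlay=true)
lazinessPct = input.float(1.5, "Laziness %")
smoothLen   = input.int(50, "Smoothing length")
basis = ta.sma(close, smoothLen)
var float level = na
level := na(level) ? basis : level
step = level * lazinessPct / 100.0
// the level only steps once the smoothed price has moved a full laziness step away
if basis >= level + step
    level := level + step
else if basis <= level - step
    level := level - step
plot(level, "Grid centerline", color.blue, 2, plot.style_stepline)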
Elasticity:
Elasticity is the degree of “stickiness” to the current price trend. If the smoothing line remains above (or below) the current grid center without reverting but still not enough to reach the next grid level, the grid line will start to curve toward the next grid level. Elasticity is added to (or subtracted from) the gridline by a factor of minimum system ticks for the current pair. Elasticity of zero will keep the gridlines horizontal. If elasticity is too high, the grid will distort.
Grid Intervals:
Grid intervals are the percentage of space between each grid.
Laziness = 4%, Elasticity = 0. Price must move at least 4% before reaching the next level. With zero elasticity, gridlines are straight.
Laziness = 5%, Elasticity = 100. For each bar at a new grid level, the grid will start “curve” toward the next price level (up if price is greater than the middle grid, down if less than middle grid). Elasticity is calculated by the user-inputted “Elasticity” multiplied by the minimum tick for the current pair (ELSTX = syminfo.mintick * iELSTX)
Try experimenting with different combinations of the Smoothing Length, Smoothing Type, Laziness, Elasticity, and Grid Intervals to find the optimum settings for each chart. Lower-priced pairs (e.g. XRP/ADA/DODGE) will require lower Elasticity. Also note that different exchanges may have different minimum tick values. For example, minimum tick for BITMEX:XBTUSD and BYBIT:BTCUSD is .5, but BINANCE:BTCUSDT and COINBASE:BTCUSD is .01.
DODGEUSDT, 5min. Laziness: 4%, Elasticity 2.5
Number of Grids: 2. Laziness: 3.75%. Elasticity: 150. Grid Interval 2%.
Settings Overview
Smoothing Length : Smoothing period
Smoothing Type : Linear Regression, Simple Moving Average, Exponential Moving Average, Volume-Weighted Moving Average, Triple-Exponential Moving Average
Laziness : Percentage required for price to move until it reaches the next level. If price does not reach the next level (up or down), the grid will remain the same as previous grid (because it’s lazy).
Elasticity : Amount of curvature toward the next grid, based on the current price trend. As elasticity increases, gridlines will curve up or down by a factor of the number of ticks since the last grid change.
Grid Interval : Percent between grid levels.
Number of Grids : Number of grids to show.
Cooldown : Number of bars to wait to prevent consecutive signals.
Grid Line Transparency : Lower transparencies brighten the gridlines; higher transparencies dim the gridlines. To hide the gridlines completely, enter 100.
Fill Transparency: Lower transparencies brighten the fill box; higher transparencies dim the fill box. To hide the fill box completely, enter 100.
Signal Size : Make signal triangles large or small.
Reset Buy/Sell Index When Grids Change : When a new grid is formed, resetting the index may prevent false signals (experimental)
Use Highs/Lows for Signals : If enabled, signals are triggered as soon as the price touches the next zone. If disabled, signals are triggered after the bar closes. Enable this for “Once Per Bar” alerts. Disable for “Once Per Bar Close” alerts.
Show Min Tick : If checked, syminfo.mintick is displayed in upper-righthand corner. Useful for estimating Laziness.
Reverse Fill Colors : Default fill for fill boxes is green after buy and red after sell. Check this box to reverse.
Note: The Grid Bot Simulator scripts are experimental and works in progress. Please feel free to comment or contact me if you have suggestions/complaints.
Raff Regression Channel by DGT
Rᴀꜰꜰ Rᴇɢʀᴇꜱꜱɪᴏɴ Cʜᴀɴɴᴇʟ (RRC)
This study aims to automate Raff Regression Channel drawing, based either on the ZigZag Indicator or, optionally, on user preference
The Raff Regression Channel , developed by Gilbert Raff, is based on a linear regression, which is the least-squares line-of-best-fit for a price series, with evenly spaced trend lines above and below . The width of the channel is set by determining the high or low that is the furthest from the linear regression.
Because the channel distance is based on the largest pullback or highest peak within a trend, it is recommended (effectively required) that a Raff Regression Channel be applied to “mature” trends. Knowing this requirement, and for better automated drawing results, this study benefits from the Zig Zag Indicator, which is used to help identify price trends and changes in price trends. An option to manually adjust the lengths used for drawing the Raff Regression Channel is also available.
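For readers curious about the mechanics, the manual construction described above can be sketched in a few lines: fit a least-squares line to the closes of the chosen window, find the single high or low farthest from that line, and offset the line up and down by that distance. This is a generic sketch over a fixed lookback, not DGT's auto/ZigZag-driven implementation:
//@version=5
indicator("Raff Regression Channel (sketch)", overlay=true, max_lines_count=500, max_bars_back=500)
length = input.int(100, "Number of bars")
if barstate.islast
    // least-squares fit of close against bar offset over the last `length` bars
    sumX = 0.0
    sumY = 0.0
    sumXY = 0.0
    sumX2 = 0.0
    for i = 0 to length - 1
        y = close[length - 1 - i]
        sumX += i
        sumY += y
        sumXY += i * y
        sumX2 += i * i
    slope = (length * sumXY - sumX * sumY) / (length * sumX2 - sumX * sumX)
    intercept = (sumY - slope * sumX) / length
    // channel width: the high or low that is farthest from the regression line
    maxDev = 0.0
    for i = 0 to length - 1
        fit = intercept + slope * i
        maxDev := math.max(maxDev, math.max(high[length - 1 - i] - fit, fit - low[length - 1 - i]))
    x1 = bar_index - (length - 1)
    y2 = intercept + slope * (length - 1)
    line.new(x1, intercept, bar_index, y2, color = color.gray, width = 2)
    line.new(x1, intercept + maxDev, bar_index, y2 + maxDev, color = color.red)
    line.new(x1, intercept - maxDev, bar_index, y2 - maxDev, color = color.green)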
Using a Raff Regression Channel
Once The Raff Regression Channel is drawn, covering an existing trend, Exᴛᴇɴꜱɪᴏɴ Lɪɴᴇꜱ are drawn to identify ᴛʜᴇ ꜱᴜᴘᴘᴏʀᴛ﹐ʀᴇꜱɪꜱᴛᴀɴᴄᴇ ᴏʀ ʀᴇᴠᴇʀꜱᴀʟ ᴘᴏɪɴᴛꜱ
The trend is up as long as prices rise within this channel. An uptrend may be reversing (not always, but likely) when price breaks below the channel extension. The trend is down as long as prices decline within the channel. Similarly, a downtrend may be reversing (not always, but likely) when price breaks above the channel extension. Moves outside the channel extensions can be an indication of a reversal or can denote overbought or oversold conditions.
For further details please refer to education post Raff Regression Channel
█ FEATURES
- AUTO or MANUALLY adjusted Raff Regression Channel and Channel Extensions drawing
- ALERTs for the Linear Regression Line and the Raff Regression Upper and Lower Channel Extensions
- LSMA , Least Squares Moving Average, in other words Linear Regression Curve
█ SETTINGS
Setting Lookback and Number of Bars is the most important part of the Raff Regression Channel, where:
- Lookback defines where the Raff Regression Channel starts; it is recommended to set it to the beginning of a trend
- Number of Bars defines how many bars are assumed for the calculation, or simply stated, the end of the Raff Regression Channel drawing (not the extensions but the main channel; the extensions by default are drawn up to the last bar)
Setting of Lookback and Number of Bars is performed either automatically, based on the Zig Zag indicator, or manually if users prefer. If set automatically, then
- the Deviation and Depth values of the Zig Zag indicator are used for the calculations (enabling plotting of ZigZag Lines will help to identify the points visually), where:
Deviation is a multiplier that affects how much the price should deviate from the previous pivot in order for the bar to become a new pivot.
Depth affects the minimum number of bars that will be taken into account when building a pivot.
Short-term traders may wish to apply the channel to smaller waves of a trend, so they can reduce the values of Deviation and Depth
█ OTHER CHANNEL CONCEPTS
Linear Regression Channels: what linear regression channels are, and the linear regression channel/curve/slope study
Fibonacci Channels: how to apply Fibonacci channels, and the automated Fibonacci channels study
Andrews' Pitchfork: how to apply the pitchfork, and the automated pitchfork study
Special Thanks to @Kiss66000 for his kind suggestion, thank you very much @Kiss66000
Disclaimer :
Trading success is all about following your trading strategy; indicators should fit within your trading strategy and should not be traded upon solely
The script is for informational and educational purposes only. Use of the script does not constitute professional and/or financial advice. You alone have the sole responsibility of evaluating the script output and risks associated with the use of the script. In exchange for using the script, you agree not to hold dgtrd TradingView user liable for any possible claim for damages arising from any decision you make based on use of the script