kNN
by lastguru
Library "kNN"
Collection of experimental kNN functions. This is a work in progress and an improvement upon my original kNN script.
The original script can be recreated with this library. Unlike the original script, which used multiple arrays, this version has been reworked with the new Pine Script matrix features.

To make a kNN prediction, the following data should be supplied to the wrapper:
  • kNN: filter type. Currently either Binary or Percent. Binary works as in the original script: the system stores whether the price has increased (+1) or decreased (-1) since the previous knnStore event (triggered when either the long or short condition is supplied). Percent works the same way, but the stored values are the price differences in percent, so larger price moves carry more weight.
  • k: number k. This is how many nearest neighbors are to be selected (and summed up to get the result).
  • skew: kNN minimum difference. Normally, the prediction is made by a simple majority of the neighbor votes. If skew is given, more than a simple majority is needed for a prediction. This also means that there are inputs for which no prediction is given (when the vote sum falls between -skew and +skew). Note that in Percent mode more profitable trades have higher voting power.
  • depth: kNN matrix size limit. Originally, the whole available history of trades was used to make a prediction. This not only requires more computational power, but also neglects the fact that the market conditions are changing. This setting restricts the memory matrix to a finite number of past trades.
  • price: price series
  • long: long condition. True if the long conditions are met but filters have not yet been applied. For example, in my original script, trades are only made on crossings of the fast and slow MAs, so whenever it is possible to go long, this value is set to true; otherwise it is false.
  • short: short condition. Same as long, but for short condition.
  • store: whether the inputs should be stored. Additional filters may be applied to prevent bad trades (for example, trend-based filters), so if you only need to consult kNN without storing the trade, this should be set to false.
  • feature1: current value of feature 1. A feature here is some value derived from the price, such as an oscillator reading. Different features may be used to analyse the price series, but as the current kNN implementation is 2-dimensional, only two of them can be used for the prediction.
  • feature2: current value of feature 2.

The wrapper returns a tuple: [longOK, shortOK]. This is a pair of filters: when longOK is true, kNN predicts that a long trade may be taken; when shortOK is true, kNN predicts that a short trade may be taken. The kNN filters are returned whenever the long or short condition is met. A trade is supposed to happen when the long or short condition is met and the kNN filter for the desired direction is true.
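
As a rough sketch of how the wrapper could be wired into an indicator (the import path lastguru/kNN/2, the "Binary" filter-type string, and all parameter values and feature choices below are illustrative assumptions, not taken from the library documentation):

//@version=5
indicator("kNN filter usage sketch", overlay = true)

// Assumed import path for this library (username/libraryName/version)
import lastguru/kNN/2 as knn

// Two example features derived from price; any oscillator values could serve here
feature1 = ta.rsi(close, 14)
feature2 = ta.cci(close, 20)

// Raw long/short conditions, e.g. fast/slow MA crossings as in the original script
fastMA = ta.ema(close, 10)
slowMA = ta.ema(close, 50)
longCondition  = ta.crossover(fastMA, slowMA)
shortCondition = ta.crossunder(fastMA, slowMA)

// Consult the kNN filter: Binary mode (passing the type as a string is an assumption),
// k = 5, no skew, memory limited to 100 past trades, and store = true
[longOK, shortOK] = knn.doKNN("Binary", 5, 0, 100, close, longCondition, shortCondition, true, feature1, feature2)

plotshape(longCondition and longOK, style = shape.triangleup, location = location.belowbar, color = color.green)
plotshape(shortCondition and shortOK, style = shape.triangledown, location = location.abovebar, color = color.red)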

Exported functions:

knnStore(knn, p1, p2, src, maxrows)
  Store the previous trade; buffer the current one until results are in. Results are binary: up/down
  Parameters:
    knn: knn matrix
    p1: feature 1 value
    p2: feature 2 value
    src: current price
    maxrows: limit the matrix size to this number of rows (0 for no limit)
  Returns: modified knn matrix

knnStorePercent(knn, p1, p2, src, maxrows)
  Store the previous trade; buffer the current one until results are in. Results are in percent
  Parameters:
    knn: knn matrix
    p1: feature 1 value
    p2: feature 2 value
    src: current price
    maxrows: limit the matrix size to this number of rows (0 for no limit)
  Returns: modified knn matrix
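
For reference, a minimal sketch of driving the storage functions directly rather than through the wrapper. The import path and the idea of starting from an empty float matrix are assumptions; knnStorePercent can be swapped in for knnStore to store percent differences instead of binary results:

//@version=5
indicator("kNN storage sketch")

// Assumed import path for this library
import lastguru/kNN/2 as knn

// Persistent memory matrix; initializing it as an empty float matrix is an assumption
var knnMatrix = matrix.new<float>()

feature1 = ta.rsi(close, 14)
feature2 = ta.cci(close, 20)
fastMA = ta.ema(close, 10)
slowMA = ta.ema(close, 50)
longCondition  = ta.crossover(fastMA, slowMA)
shortCondition = ta.crossunder(fastMA, slowMA)

// On every long/short event, record the buffered trade and buffer the current one,
// keeping at most 100 rows (0 would mean no limit); swap in knnStorePercent to
// store percent differences instead of binary results
if longCondition or shortCondition
    knnMatrix := knn.knnStore(knnMatrix, feature1, feature2, close, 100)

plot(matrix.rows(knnMatrix), "Stored trades")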

knnGet(distance, result)
  Get the k nearest neighbours by selecting the k results with the smallest distances
  Parameters:
    distance: distance array
    result: result array
  Returns: array slice of k results

knnDistance(knn, p1, p2)
  Create a distance array from the two given parameters
  Parameters:
    knn: knn matrix
    p1: feature 1 value
    p2: feature 2 value
  Returns: distance array
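
The metric used by knnDistance is not spelled out above. Purely as a conceptual illustration, not the library's code, a Euclidean distance over the two features could be built as follows, assuming columns 0 and 1 of the matrix hold feature 1 and feature 2:

//@version=5
indicator("kNN distance sketch")

// Conceptual illustration only: Euclidean distance from the current point (p1, p2)
// to every stored feature pair. The metric and the column layout (column 0 holding
// feature 1, column 1 holding feature 2) are assumptions, not the library's documented internals.
knnDistanceSketch(matrix<float> knn, float p1, float p2) =>
    distance = array.new_float()
    rows = matrix.rows(knn)
    if rows > 0
        for i = 0 to rows - 1
            d1 = matrix.get(knn, i, 0) - p1
            d2 = matrix.get(knn, i, 1) - p2
            array.push(distance, math.sqrt(d1 * d1 + d2 * d2))
    distance

// Tiny demonstration: two stored feature pairs, queried with current oscillator values
var demo = matrix.new<float>()
if barstate.isfirst
    matrix.add_row(demo, 0, array.from(30.0, -100.0))
    matrix.add_row(demo, 1, array.from(70.0, 100.0))
dist = knnDistanceSketch(demo, ta.rsi(close, 14), ta.cci(close, 20))
plot(array.size(dist) > 0 ? array.min(dist) : na, "Nearest distance")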

knnSum(knn, p1, p2, k)
  Make a prediction by finding the k nearest neighbours and summing up their results
  Parameters:
    knn: knn matrix
    p1: feature 1 value
    p2: feature 2 value
    k: sum k nearest neighbors
  Returns: sum of k nearest neighbors
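
Continuing the storage sketch above, a hedged sketch of how the raw sum could be turned into directional filters using the skew threshold described earlier (this mirrors what the wrapper is described as doing; the skew value here is arbitrary):

// Continues the storage sketch above: knnMatrix, feature1 and feature2 as defined there
skew = 1.0                                               // minimum vote difference; value chosen arbitrarily
voteSum = knn.knnSum(knnMatrix, feature1, feature2, 5)   // sum of the 5 nearest neighbour results
longOK  = voteSum >  skew                                // predict long only on a clear majority
shortOK = voteSum < -skew                                // no prediction while the sum sits between -skew and +skew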

doKNN(kNN, k, skew, depth, price, long, short, store, feature1, feature2)
  Execute the kNN filter
  Parameters:
    kNN: filter type
    k: number k
    skew: kNN minimum difference
    depth: kNN matrix size limit
    price: price series
    long: long condition
    short: short condition
    store: store the supplied features (if false, only checks the results without storage)
    feature1: feature 1 value
    feature2: feature 2 value
  Returns: filter output
Release Notes:
v2: bugfix

Tips in TradingView Coins are appreciated