
Merge pull request #9348 from freqtrade/new_release
New release 2023.10
xmatthias committed Oct 31, 2023
2 parents e73f215 + c98c6c3 commit f142abf
Showing 57 changed files with 4,669 additions and 1,557 deletions.
6 changes: 3 additions & 3 deletions .github/workflows/ci.yml
@@ -108,7 +108,7 @@ jobs:
- name: Run Ruff
run: |
-          ruff check --format=github .
+          ruff check --output-format=github .
- name: Mypy
run: |
@@ -217,7 +217,7 @@ jobs:
- name: Run Ruff
run: |
-          ruff check --format=github .
+          ruff check --output-format=github .
- name: Mypy
run: |
@@ -287,7 +287,7 @@ jobs:
- name: Run Ruff
run: |
-          ruff check --format=github .
+          ruff check --output-format=github .
- name: Mypy
run: |
6 changes: 3 additions & 3 deletions .pre-commit-config.yaml
@@ -15,10 +15,10 @@ repos:
additional_dependencies:
- types-cachetools==5.3.0.6
- types-filelock==3.2.7
-      - types-requests==2.31.0.4
+      - types-requests==2.31.0.10
- types-tabulate==0.9.0.3
- types-python-dateutil==2.8.19.14
-      - SQLAlchemy==2.0.21
+      - SQLAlchemy==2.0.22
# stages: [push]

- repo: https://github.com/pycqa/isort
@@ -30,7 +30,7 @@ repos:

- repo: https://github.com/charliermarsh/ruff-pre-commit
# Ruff version.
-    rev: 'v0.0.270'
+    rev: 'v0.1.1'
hooks:
- id: ruff

2 changes: 1 addition & 1 deletion docker/Dockerfile.armhf
@@ -23,7 +23,7 @@ WORKDIR /freqtrade
# Install dependencies
FROM base as python-deps
RUN apt-get update \
-    && apt-get -y install build-essential libssl-dev libffi-dev libgfortran5 pkg-config cmake gcc \
+    && apt-get -y install build-essential libssl-dev libffi-dev libopenblas-dev libgfortran5 pkg-config cmake gcc \
&& apt-get clean \
&& pip install --upgrade pip \
&& echo "[global]\nextra-index-url=https://www.piwheels.org/simple" > /etc/pip.conf
4 changes: 2 additions & 2 deletions docs/backtesting.md
@@ -31,9 +31,9 @@ optional arguments:
Specify timeframe (`1m`, `5m`, `30m`, `1h`, `1d`).
--timerange TIMERANGE
Specify what timerange of data to use.
-  --data-format-ohlcv {json,jsongz,hdf5}
+  --data-format-ohlcv {json,jsongz,hdf5,feather,parquet}
                         Storage format for downloaded candle (OHLCV) data.
-                        (default: `json`).
+                        (default: `feather`).
--max-open-trades INT
Override the value of the `max_open_trades`
configuration setting.
37 changes: 37 additions & 0 deletions docs/exchanges.md
@@ -136,6 +136,43 @@ Freqtrade will not attempt to change these settings.
The Kraken API only provides 720 historic candles, which is sufficient for Freqtrade dry-run and live trade modes, but is a problem for backtesting.
To download data for the Kraken exchange, using `--dl-trades` is mandatory, otherwise the bot will download the same 720 candles over and over, and you won't have enough backtest data.

To speed up downloading, you can download the [trades zip files](https://support.kraken.com/hc/en-us/articles/360047543791-Downloadable-historical-market-data-time-and-sales-) Kraken provides.
These are usually updated once per quarter. Freqtrade expects these files to be placed in `user_data/data/kraken/trades_csv`.

If you use incremental files, a structure like the following can make sense, with the "full" history in one directory and the incremental files in separate directories.
This mode assumes the data is downloaded and unzipped with the filenames kept as they are.
Duplicate content will be ignored (based on timestamp) - though the data is assumed to contain no gaps.

This means that if your "full" history ends in Q4 2022, both incremental updates Q1 2023 and Q2 2023 must be available.
Missing one of them will lead to incomplete data, and therefore invalid results when using the data.

```
└── trades_csv
    ├── Kraken_full_history
    │   ├── BCHEUR.csv
    │   └── XBTEUR.csv
    ├── Kraken_Trading_History_Q1_2023
    │   ├── BCHEUR.csv
    │   └── XBTEUR.csv
    └── Kraken_Trading_History_Q2_2023
        ├── BCHEUR.csv
        └── XBTEUR.csv
```

You can convert these files into freqtrade files:

``` bash
freqtrade convert-trade-data --exchange kraken --format-from kraken_csv --format-to feather
# Convert trade data to different ohlcv timeframes
freqtrade trades-to-ohlcv -p BTC/EUR BCH/EUR --exchange kraken -t 1m 5m 15m 1h
```

The converted data can also be used as a base for further downloads - a subsequent download will resume after the latest loaded trade.

``` bash
freqtrade download-data --exchange kraken --dl-trades -p BTC/EUR BCH/EUR
```

!!! Warning "Downloading data from kraken"
    Downloading kraken data will require significantly more memory (RAM) than for any other exchange, as the trades data needs to be converted into candles on your machine.
    It will also take a long time, as freqtrade will need to download every single trade that happened on the exchange for the pair / timerange combination, so please be patient.
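For illustration, the trades-to-candles aggregation described in the warning above can be sketched with plain pandas. This is a toy sketch, not freqtrade's internal implementation; the sample timestamps, prices and amounts are made up, mirroring only the column layout of Kraken's csv files.

```python
import pandas as pd

# Hypothetical sample trades (timestamp in seconds, price, amount),
# mirroring the columns of Kraken's csv files.
trades = pd.DataFrame({
    "timestamp": [1672531200, 1672531230, 1672531260, 1672531290],
    "price": [16500.0, 16510.0, 16505.0, 16520.0],
    "amount": [0.1, 0.2, 0.05, 0.3],
})
trades["date"] = pd.to_datetime(trades["timestamp"], unit="s", utc=True)
trades = trades.set_index("date")

# Aggregate trades into 1-minute OHLCV candles
ohlcv = trades["price"].resample("1min").ohlc()
ohlcv["volume"] = trades["amount"].resample("1min").sum()
```

Every trade inside a given minute contributes to that minute's open/high/low/close and summed volume - which is why converting years of tick data is memory- and time-intensive.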
34 changes: 32 additions & 2 deletions docs/recursive-analysis.md
@@ -40,11 +40,41 @@ usage: freqtrade recursive-analysis [-h] [-v] [--logfile FILE] [-V] [-c PATH]
[--startup-candle STARTUP_CANDLES [STARTUP_CANDLES ...]]
optional arguments:
-p PAIR, --pairs PAIR
-h, --help show this help message and exit
-i TIMEFRAME, --timeframe TIMEFRAME
Specify timeframe (`1m`, `5m`, `30m`, `1h`, `1d`).
--data-format-ohlcv {json,jsongz,hdf5,feather,parquet}
Storage format for downloaded candle (OHLCV) data.
(default: `feather`).
-p PAIR, --pairs PAIR
Limit command to this pair.
--startup-candle STARTUP_CANDLE [STARTUP_CANDLE ...]
Provide a space-separated list of startup_candle_count to
be checked. Default : `199 399 499 999 1999`.
Common arguments:
-v, --verbose Verbose mode (-vv for more, -vvv to get all messages).
--logfile FILE Log to the file specified. Special values are:
'syslog', 'journald'. See the documentation for more
details.
-V, --version show program's version number and exit
-c PATH, --config PATH
Specify configuration file (default:
`userdir/config.json` or `config.json` whichever
exists). Multiple --config options may be used. Can be
set to `-` to read config from stdin.
-d PATH, --datadir PATH
Path to directory with historical backtesting data.
--userdir PATH, --user-data-dir PATH
Path to userdata directory.
Strategy arguments:
-s NAME, --strategy NAME
Specify strategy class name which will be used by the
bot.
--strategy-path PATH Specify additional strategy lookup path.
--timerange TIMERANGE
Specify what timerange of data to use.
```

### Why are odd-numbered default startup candles used?
6 changes: 3 additions & 3 deletions docs/requirements-docs.txt
@@ -1,6 +1,6 @@
-markdown==3.4.4
+markdown==3.5
 mkdocs==1.5.3
-mkdocs-material==9.4.1
+mkdocs-material==9.4.6
 mdx_truly_sane_lists==1.3
-pymdown-extensions==10.3
+pymdown-extensions==10.3.1
jinja2==3.1.2
4 changes: 4 additions & 0 deletions docs/strategy-customization.md
@@ -1008,6 +1008,10 @@ The following lists some common patterns which should be avoided to prevent frus
- don't use `dataframe['volume'].mean()`. This uses the full DataFrame for backtesting, including data from the future. Use `dataframe['volume'].rolling(<window>).mean()` instead
- don't use `.resample('1h')`. This uses the left border of the interval, so moves data from an hour to the start of the hour. Use `.resample('1h', label='right')` instead.
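A minimal sketch of why the first pattern leaks future data (the `volume` values are made up for the example):

```python
import pandas as pd

# Hypothetical volume series over six candles
df = pd.DataFrame({"volume": [10.0, 20.0, 30.0, 40.0, 50.0, 60.0]})

# Lookahead bias: every row sees the mean over the WHOLE backtest range,
# including candles that are still in the future at that point in time.
df["vol_mean_biased"] = df["volume"].mean()

# Safe: each row only sees the current and the two preceding candles.
df["vol_mean_rolling"] = df["volume"].rolling(3).mean()
```

In backtesting the biased column assigns the same value (35.0) to every candle, including the very first one, which could never have known the later volumes; live, the same code would produce different values, so backtest results won't match reality.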

!!! Tip "Identifying problems"
    You may also want to check the two helper commands [lookahead-analysis](lookahead-analysis.md) and [recursive-analysis](recursive-analysis.md), each of which can help you find problems in your strategy in different ways.
    Please treat them as what they are - helpers to identify the most common problems. A negative result from either does not guarantee that none of the above errors are present.

### Colliding signals

When conflicting signals collide (e.g. both `'enter_long'` and `'exit_long'` are 1), freqtrade will do nothing and ignore the entry signal. This will avoid trades that enter, and exit immediately. Obviously, this can potentially lead to missed entries.
2 changes: 1 addition & 1 deletion freqtrade/__init__.py
@@ -1,5 +1,5 @@
""" Freqtrade bot """
-__version__ = '2023.9'
+__version__ = '2023.10'

if 'dev' in __version__:
from pathlib import Path
4 changes: 2 additions & 2 deletions freqtrade/commands/arguments.py
@@ -65,8 +65,8 @@

ARGS_BUILD_STRATEGY = ["user_data_dir", "strategy", "template"]

+ARGS_CONVERT_DATA_TRADES = ["pairs", "format_from_trades", "format_to", "erase", "exchange"]
ARGS_CONVERT_DATA = ["pairs", "format_from", "format_to", "erase", "exchange"]

ARGS_CONVERT_DATA_OHLCV = ARGS_CONVERT_DATA + ["timeframes", "trading_mode", "candle_types"]

ARGS_CONVERT_TRADES = ["pairs", "timeframes", "exchange", "dataformat_ohlcv", "dataformat_trades"]
@@ -268,7 +268,7 @@ def _build_subcommands(self) -> None:
parents=[_common_parser],
)
convert_trade_data_cmd.set_defaults(func=partial(start_convert_data, ohlcv=False))
-        self._build_args(optionlist=ARGS_CONVERT_DATA, parser=convert_trade_data_cmd)
+        self._build_args(optionlist=ARGS_CONVERT_DATA_TRADES, parser=convert_trade_data_cmd)

# Add trades-to-ohlcv subcommand
convert_trade_data_cmd = subparsers.add_parser(
6 changes: 6 additions & 0 deletions freqtrade/commands/cli_options.py
@@ -421,6 +421,12 @@ def __init__(self, *args, **kwargs):
'desired timeframe as specified as --timeframes/-t.',
action='store_true',
),
+        "format_from_trades": Arg(
+            '--format-from',
+            help='Source format for data conversion.',
+            choices=constants.AVAILABLE_DATAHANDLERS + ['kraken_csv'],
+            required=True,
+        ),
"format_from": Arg(
'--format-from',
help='Source format for data conversion.',
2 changes: 1 addition & 1 deletion freqtrade/commands/data_commands.py
@@ -85,7 +85,7 @@ def start_convert_data(args: Dict[str, Any], ohlcv: bool = True) -> None:
erase=args['erase'])
else:
convert_trades_format(config,
-                              convert_from=args['format_from'], convert_to=args['format_to'],
+                              convert_from=args['format_from_trades'], convert_to=args['format_to'],
erase=args['erase'])


4 changes: 2 additions & 2 deletions freqtrade/commands/optimize_commands.py
@@ -140,7 +140,7 @@ def start_lookahead_analysis(args: Dict[str, Any]) -> None:
:param args: Cli args from Arguments()
:return: None
"""
-    from freqtrade.optimize.lookahead_analysis_helpers import LookaheadAnalysisSubFunctions
+    from freqtrade.optimize.analysis.lookahead_helpers import LookaheadAnalysisSubFunctions

config = setup_utils_configuration(args, RunMode.UTIL_NO_EXCHANGE)
LookaheadAnalysisSubFunctions.start(config)
@@ -152,7 +152,7 @@ def start_recursive_analysis(args: Dict[str, Any]) -> None:
:param args: Cli args from Arguments()
:return: None
"""
-    from freqtrade.optimize.recursive_analysis_helpers import RecursiveAnalysisSubFunctions
+    from freqtrade.optimize.analysis.recursive_helpers import RecursiveAnalysisSubFunctions

config = setup_utils_configuration(args, RunMode.UTIL_NO_EXCHANGE)
RecursiveAnalysisSubFunctions.start(config)
11 changes: 11 additions & 0 deletions freqtrade/data/converter/trade_converter.py
@@ -12,6 +12,7 @@
from freqtrade.constants import (DEFAULT_DATAFRAME_COLUMNS, DEFAULT_TRADES_COLUMNS, TRADES_DTYPES,
                                 Config, TradeList)
from freqtrade.enums import CandleType
from freqtrade.exceptions import OperationalException


logger = logging.getLogger(__name__)
@@ -127,6 +128,16 @@ def convert_trades_format(config: Config, convert_from: str, convert_to: str, er
:param convert_to: Target format
:param erase: Erase source data (does not apply if source and target format are identical)
"""
+    if convert_from == 'kraken_csv':
+        if config['exchange']['name'] != 'kraken':
+            raise OperationalException(
+                'Converting from csv is only supported for kraken. '
+                'Please refer to the documentation for details about this special mode.'
+            )
+        from freqtrade.data.converter.trade_converter_kraken import import_kraken_trades_from_csv
+        import_kraken_trades_from_csv(config, convert_to)
+        return

from freqtrade.data.history.idatahandler import get_datahandler
src = get_datahandler(config['datadir'], convert_from)
trg = get_datahandler(config['datadir'], convert_to)
70 changes: 70 additions & 0 deletions freqtrade/data/converter/trade_converter_kraken.py
@@ -0,0 +1,70 @@
import logging
from pathlib import Path

import pandas as pd

from freqtrade.constants import DATETIME_PRINT_FORMAT, DEFAULT_TRADES_COLUMNS, Config
from freqtrade.data.converter.trade_converter import (trades_convert_types,
                                                      trades_df_remove_duplicates)
from freqtrade.data.history.idatahandler import get_datahandler
from freqtrade.exceptions import OperationalException
from freqtrade.resolvers import ExchangeResolver


logger = logging.getLogger(__name__)

KRAKEN_CSV_TRADE_COLUMNS = ['timestamp', 'price', 'amount']


def import_kraken_trades_from_csv(config: Config, convert_to: str):
    """
    Import kraken trades from csv
    """
    if config['exchange']['name'] != 'kraken':
        raise OperationalException('This function is only for the kraken exchange.')

    datadir: Path = config['datadir']
    data_handler = get_datahandler(datadir, data_format=convert_to)

    tradesdir: Path = config['datadir'] / 'trades_csv'
    exchange = ExchangeResolver.load_exchange(config, validate=False)
    # iterate through directories in this directory
    data_symbols = {p.stem for p in tradesdir.rglob('*.csv')}

    # create pair/filename mapping
    markets = {
        (m['symbol'], m['altname']) for m in exchange.markets.values()
        if m.get('altname') in data_symbols
    }
    logger.info(f"Found csv files for {', '.join(data_symbols)}.")

    for pair, name in markets:
        dfs = []
        # Load and combine all csv files for this pair
        for f in tradesdir.rglob(f"{name}.csv"):
            df = pd.read_csv(f, names=KRAKEN_CSV_TRADE_COLUMNS)
            dfs.append(df)

        # Load existing trades data
        if not dfs:
            # edgecase, can only happen if the file was deleted between the above glob and here
            logger.info(f"No data found for pair {pair}")
            continue

        trades = pd.concat(dfs, ignore_index=True)

        trades.loc[:, 'timestamp'] = trades['timestamp'] * 1e3
        trades.loc[:, 'cost'] = trades['price'] * trades['amount']
        for col in DEFAULT_TRADES_COLUMNS:
            if col not in trades.columns:
                trades[col] = ''

        trades = trades[DEFAULT_TRADES_COLUMNS]
        trades = trades_convert_types(trades)

        trades_df = trades_df_remove_duplicates(trades)
        logger.info(f"{pair}: {len(trades_df)} trades, from "
                    f"{trades_df['date'].min():{DATETIME_PRINT_FORMAT}} to "
                    f"{trades_df['date'].max():{DATETIME_PRINT_FORMAT}}")

        data_handler.trades_store(pair, trades_df)
8 changes: 6 additions & 2 deletions freqtrade/exchange/binance.py
@@ -123,10 +123,14 @@ async def _async_get_historic_ohlcv(self, pair: str, timeframe: str,

     def funding_fee_cutoff(self, open_date: datetime):
         """
+        Funding fees are only charged at full hours (usually every 4-8h).
+        Therefore a trade opening at 10:00:01 will not be charged a funding fee until the next hour.
+        On binance, this cutoff is 15s.
+        https://github.com/freqtrade/freqtrade/pull/5779#discussion_r740175931
         :param open_date: The open date for a trade
-        :return: The cutoff open time for when a funding fee is charged
+        :return: True if the date falls on a full hour, False otherwise
         """
-        return open_date.minute > 0 or (open_date.minute == 0 and open_date.second > 15)
+        return open_date.minute == 0 and open_date.second < 15

def dry_run_liquidation_price(
self,
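The corrected cutoff predicate from this diff can be exercised standalone; a minimal sketch (outside the exchange class, so `self` is dropped, with made-up trade open times):

```python
from datetime import datetime


def funding_fee_cutoff(open_date: datetime) -> bool:
    # Mirrors the changed logic: the current hour's funding fee only
    # applies if the trade opened within the first 15 seconds of a full hour.
    return open_date.minute == 0 and open_date.second < 15


cutoff_applies = funding_fee_cutoff(datetime(2023, 10, 1, 8, 0, 10))  # opened 10s past the hour
cutoff_missed = funding_fee_cutoff(datetime(2023, 10, 1, 8, 0, 20))   # opened 20s past the hour
```

The old expression returned the opposite meaning (True for most open times), which is why both the return value and the docstring were changed together.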
