To make it run:

```shell
docker run -d -p 443:9443 --name portainer \
  --restart=always \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer-ce:2.11.1
```
To stop it (in case you want to upgrade):

```shell
docker stop portainer
docker rm portainer
```
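To finish an upgrade after stopping and removing the container, pull the newer image and start it again with the same flags (a sketch; substitute the tag you are upgrading to, and note the named `portainer_data` volume preserves your settings across the recreate):

```shell
# Replace 2.11.1 with the version you are upgrading to
docker pull portainer/portainer-ce:2.11.1
docker run -d -p 443:9443 --name portainer \
  --restart=always \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer-ce:2.11.1
```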
- Clone this repo to `~/biggest-losers`
- Install Python 3.9 or higher (not lower; we need the `zoneinfo` module for timezones). Running `python3 --version` should return `Python 3.9.x` or newer.
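The 3.9 floor exists because `zoneinfo` only joined the standard library in Python 3.9. A quick sanity check (the `America/New_York` zone is just an illustrative choice):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9; ImportError on older versions

# Build a timezone-aware "now"; tzname() resolves to EST or EDT by date
now_et = datetime.now(ZoneInfo("America/New_York"))
print(now_et.isoformat(), now_et.tzname())
```

If this raises `ImportError`, the interpreter is older than 3.9.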
- Install nodemon (install Node.js, then run `npm install -g nodemon`; you may need `sudo`).
- Run the following:
  ```shell
  mkdir -p ~/biggest-losers-data/cache
  mkdir -p ~/biggest-losers-data/inputs
  mkdir -p ~/biggest-losers-data/outputs
  mkdir -p ~/biggest-losers-data/logs
  ```
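The four `mkdir` calls can equivalently be collapsed into one POSIX-shell loop (`mkdir -p` is idempotent, so rerunning is safe):

```shell
# Create the cache/inputs/outputs/logs tree under ~/biggest-losers-data
for d in cache inputs outputs logs; do
  mkdir -p ~/biggest-losers-data/"$d"
done
```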
- Reference `.env.sample` and set up a `paper.env` file. Set `BROKER=none`. Create finnhub.io and polygon.io accounts, get API keys, and paste them into `paper.env`.
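Judging by the `.env.sample` naming, `paper.env` is a plain `KEY=value` file. A minimal loader sketch (the key names in the usage note are assumptions; check `.env.sample` for the real ones):

```python
def load_env(path):
    """Parse a simple KEY=value env file into a dict.

    Skips blank lines, # comments, and lines without an = sign.
    """
    env = {}
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env
```

Then something like `load_env("paper.env").get("POLYGON_API_KEY")` (hypothetical key name) would return the pasted key.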
- Ask your best friend for a zip of their cache directory; building it from scratch on the Polygon free tier (5 requests/minute) can take days.
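If you do have to backfill the cache yourself, the fetch loop needs throttling to stay under the 5 requests/minute limit. A minimal sketch (not from this repo; the wrapper and interval are illustrative):

```python
import time

MIN_INTERVAL = 60 / 5  # Polygon free tier: 5 requests per minute


def rate_limited(fn, min_interval=MIN_INTERVAL):
    """Wrap fn so successive calls are at least min_interval seconds apart."""
    last = [None]  # mutable cell holding the time of the previous call

    def wrapper(*args, **kwargs):
        if last[0] is not None:
            wait = last[0] + min_interval - time.monotonic()
            if wait > 0:
                time.sleep(wait)
        last[0] = time.monotonic()
        return fn(*args, **kwargs)

    return wrapper
```

You would wrap whatever function performs the HTTP request, e.g. `fetch = rate_limited(fetch)`.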
- Install the suggested VS Code extensions. Use the pep8 autoformatter.
- Run `pip3 install -r requirements.txt` to install Python dependencies.
TODO: what to do about `~/biggest-losers-data/outputs` syncing with Google Drive?
Follow the instructions here to generate a new token using the server: https://github.com/jamesfulford/td-token

Then `scp` the token to the remote server so it can be refreshed as needed:

```shell
scp output/token.json solomon:~/td-cash-data/inputs/td-token/output/token.json
```

(There might be issues with two different refreshers using the same tokens; it seems refreshing might cause expiration.)