Restore from S3 bucket #253
I have created a similar script. The feature would be very much appreciated.

```shell
export AWS_ACCESS_KEY_ID=$(cat ${DB01_S3_KEY_ID_FILE})
export AWS_SECRET_ACCESS_KEY=$(cat ${DB01_S3_KEY_SECRET_FILE})
export AWS_DEFAULT_REGION=${DB01_S3_REGION}
export SOURCE_FILE="mysql_.*\.gz$"
export BACKUP_LOCATION="/backup"

# Sanity-check the credentials before downloading anything.
aws sts get-caller-identity

# Newest object matching SOURCE_FILE (the name is the 4th column of `aws s3 ls`).
export LATEST_FILE=$(aws s3 ls s3://${DB01_S3_BUCKET}/${DB01_S3_PATH}/ | grep -E "${SOURCE_FILE}" | sort -r | head -n 1 | awk '{print $4}')
# TARGET_FILE must be set before use; reuse the source file name.
export TARGET_FILE=${LATEST_FILE}

aws s3 cp s3://${DB01_S3_BUCKET}/${DB01_S3_PATH}/${LATEST_FILE} ${BACKUP_LOCATION}/${TARGET_FILE}
aws s3 cp s3://${DB01_S3_BUCKET}/${DB01_S3_PATH}/${LATEST_FILE}.sha1 ${BACKUP_LOCATION}/

restore ${BACKUP_LOCATION}/${TARGET_FILE} ${DB01_TYPE} ${DB01_HOST} ${DB01_NAME} ${DB01_USER} $(cat ${DB01_PASS_FILE}) ${DB01_PORT}
```
Updated for v4:
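The script above fetches the `.sha1` sidecar but never checks it. A minimal sketch of verifying it before running `restore` — file names below are stand-ins, and the sidecar is assumed to be in `sha1sum` format (`<digest>  <filename>`, i.e. what `sha1sum file > file.sha1` produces):

```shell
set -eu
BACKUP_LOCATION=$(mktemp -d)

# Stand-in for the downloaded dump and its checksum sidecar:
echo "demo dump" > "${BACKUP_LOCATION}/mysql_demo.sql.gz"
( cd "${BACKUP_LOCATION}" && sha1sum mysql_demo.sql.gz > mysql_demo.sql.gz.sha1 )

# Abort before the restore step if the checksum does not match.
( cd "${BACKUP_LOCATION}" && sha1sum -c mysql_demo.sql.gz.sha1 ) || {
  echo "checksum mismatch, refusing to restore" >&2
  exit 1
}
echo "checksum OK"
```

Verifying before the restore means a truncated or corrupted download fails loudly instead of producing a silently broken database.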
Description of the feature
Add possibility to recover database backups stored in S3 bucket.
Benefits of the feature
Restore from remote S3 location.
Additional context
Right now, I use an additional script. It copies the latest S3 file matching ${SOURCE_FILE} (`mysql_.*\.gz$`) from ${S3_BUCKET}/${S3_PATH}, downloads it to ${TEMP_LOCATION}/${TARGET_FILE}, then executes the restore command. `aws s3 cp` can be used with ${LATEST_FILE} (selected via a ${SOURCE_FILE} grep), or with ${SOURCE_FILE} directly.
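The "latest matching file" selection described above can be sketched locally. This relies on a timestamp embedded in the backup file name sorting lexicographically; the file names here are hypothetical, and the real script applies the same `grep | sort -r | head` pipeline to `aws s3 ls` output instead of a fixed list:

```shell
# Keep only names matching the backup pattern, sort descending, take the first.
pick_latest() {
  printf '%s\n' "$@" | grep -E 'mysql_.*\.gz$' | sort -r | head -n 1
}

latest=$(pick_latest \
  mysql_db_20230101-000000.sql.gz \
  mysql_db_20230315-120000.sql.gz \
  pgsql_db_20230401-000000.sql.gz)
echo "${latest}"   # -> mysql_db_20230315-120000.sql.gz
```

Note that this picks the newest *name*, not the newest object: it is correct only while the file names carry a sortable timestamp.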