This project aims to support doctors in monitoring diabetes and preventing diabetic ketoacidosis.
Diabetes is a chronic disease that requires continuous and accurate monitoring of blood sugar levels. A potentially dangerous complication of diabetes, called ketoacidosis, occurs when the body begins to produce excess ketones, raising the acidity level of the blood.
The project is based on an IoT Cloud architecture in which each sensor (one per patient) collects information about blood pH and sends it to the Cloud, where it is processed through Serverless Computing and stored in a NoSQL database.
The sensors' functionality is inspired by the method described in the paper "Bioresorbable Nanostructured Chemical Sensor for Monitoring of pH Level In Vivo". The sensor takes pH measurements from 4.0 to 7.45.
Each sensor sends a message containing the following information:
- sensor ID;
- time in format yyyy-mm-dd hh:mm:ss;
- fiscal code of the patient;
- blood pH value.
Seven measurements are taken per day, and the messages are sent to two queues according to the pH value. All measurements are sent to the "Measurements" queue, while for pH values below 7.35 a message (without the sensor ID) is also sent to the "Warning" queue.
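The routing logic above can be sketched in Python as follows. The field names follow the sample warning message used later in this README (`fiscal_code`, `measure_date`, `measured_value`); the extra `sensor_id` field and the exact structure of the real sensors.py are assumptions:

```python
import json
from datetime import datetime

WARNING_THRESHOLD = 7.35  # readings below this pH also trigger a warning


def build_message(sensor_id, fiscal_code, ph):
    # Field names follow the sample warning message shown later in this
    # README; the "sensor_id" field name itself is an assumption.
    return {
        "sensor_id": sensor_id,
        "measure_date": datetime.now().strftime("%Y-%m-%d %H:%M:%S"),
        "fiscal_code": fiscal_code,
        "measured_value": f"{ph:.2f}",
    }


def warning_copy(message):
    # The "Warning" queue receives the same data without the sensor ID.
    return {k: v for k, v in message.items() if k != "sensor_id"}


def route(message):
    # Every reading goes to "Measurements"; low-pH readings also to "Warning".
    routed = {"Measurements": message}
    if float(message["measured_value"]) < WARNING_THRESHOLD:
        routed["Warning"] = warning_copy(message)
    return routed


def send(routed):
    # Requires LocalStack listening on port 4566 (see the setup steps below).
    import boto3
    sqs = boto3.client("sqs", endpoint_url="http://localhost:4566",
                       region_name="us-east-2")
    for queue, message in routed.items():
        url = sqs.get_queue_url(QueueName=queue)["QueueUrl"]
        sqs.send_message(QueueUrl=url, MessageBody=json.dumps(message))
```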
Each message sent on the "Measurements" queue triggers a Serverless function that is responsible for inserting the measurement into a NoSQL database. The "Measurements" table contains items with the following information:
- fiscal code of the patient;
- time in format yyyy-mm-dd hh:mm:ss;
- sensor ID;
- blood pH value.
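Such an SQS-triggered function might look like the following sketch, assuming the attribute names listed above and the standard SQS Lambda event shape; the actual settings/saveMeasurements.py may differ:

```python
import json


def record_to_item(record):
    # Each SQS record body carries one sensor message (see the message
    # format above); the attribute names here are assumptions.
    body = json.loads(record["body"])
    return {
        "fiscal_code": body["fiscal_code"],
        "measure_date": body["measure_date"],
        "sensor_id": body["sensor_id"],
        "measured_value": body["measured_value"],
    }


def lambda_handler(event, context):
    # boto3 is imported lazily so the module also loads outside Lambda.
    import boto3
    table = boto3.resource(
        "dynamodb", endpoint_url="http://localhost:4566"
    ).Table("Measurements")
    for record in event["Records"]:
        table.put_item(Item=record_to_item(record))
```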
At the end of the day, a time-triggered Serverless function computes the average of the 7 daily measurements for each patient and saves the result in the "Averages" table. Each item in the table contains the following information:
- fiscal code of the patient;
- time in format yyyy-mm-dd hh:mm:ss;
- blood pH average;
- sensor ID;
- blood pH values.
This feature is important for comparing the daily average with the averages of previous days to check the effectiveness of medical treatment.
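The time-triggered function boils down to a group-and-average over the day's items; a minimal sketch, assuming items shaped like the "Measurements" attributes above:

```python
from collections import defaultdict


def daily_averages(measurements):
    # Group the day's "Measurements" items by patient and average the pH values.
    by_patient = defaultdict(list)
    for m in measurements:
        by_patient[m["fiscal_code"]].append(float(m["measured_value"]))
    return {fc: round(sum(vals) / len(vals), 2)
            for fc, vals in by_patient.items()}
```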
Each message sent on the "Warning" queue triggers a Serverless function that sends an email notifying the doctor of the warning.
This functionality is important for the rapid administration of large quantities of intravenous fluids in combination with electrolytes, such as sodium, potassium, chloride and sometimes phosphates.
The doctor is provided with a web application offering the following functionalities:
- Display of patient list and information such as age and type of diabetes;
- Average display for the current and previous day;
- Display of measurements taken on the current and previous day;
- Display of average and measurement history;
- Insertion of a new patient.
The web application consists of a user interface and a set of RESTful APIs defined using the Flask framework.
The APIs offer the possibility of:
- obtaining all patients in the "Patients" table. Each item in the table contains the following information:
  - fiscal code;
  - first name;
  - surname;
  - date of birth in "DD/MM/YYYY" format;
  - type of diabetes (1 or 2);
  - profile image name.
  The image name is used to retrieve the patient's image contained in the S3 "patientsimages" bucket;
- obtaining the average of a given patient on a given date;
- obtaining the measurements of a specific patient on a specific date;
- saving a new patient.
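A minimal Flask sketch of such an API; the route names, payload shapes, and in-memory sample data are illustrative assumptions (the real app.py reads from the DynamoDB tables instead):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory stand-ins for the DynamoDB tables; the real application would
# query "Patients", "Averages" and "Measurements" through boto3. The patient
# record below is hypothetical sample data.
PATIENTS = [{"fiscal_code": "FRRLNZ50M24F839C", "first_name": "Mario",
             "surname": "Rossi", "date_of_birth": "24/08/1950",
             "diabetes_type": 1, "image_name": "mario.png"}]
AVERAGES = {("FRRLNZ50M24F839C", "2023-02-27"): "7.38"}
MEASUREMENTS = {("FRRLNZ50M24F839C", "2023-02-27"): ["7.35", "7.40", "7.39"]}


@app.route("/patients", methods=["GET"])
def get_patients():
    return jsonify(PATIENTS)


@app.route("/patients", methods=["POST"])
def save_patient():
    PATIENTS.append(request.get_json())
    return jsonify({"status": "created"}), 201


@app.route("/averages/<fiscal_code>/<date>", methods=["GET"])
def get_average(fiscal_code, date):
    avg = AVERAGES.get((fiscal_code, date))
    if avg is None:
        return jsonify({"error": "not found"}), 404
    return jsonify({"fiscal_code": fiscal_code, "date": date, "average": avg})


@app.route("/measurements/<fiscal_code>/<date>", methods=["GET"])
def get_measurements(fiscal_code, date):
    return jsonify(MEASUREMENTS.get((fiscal_code, date), []))
```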
- The Cloud environment is simulated using LocalStack to replicate the AWS services.
- The IoT devices are simulated using a Python function exploiting boto3 to send messages on the queues.
- The queues are implemented using Amazon Simple Queue Service (SQS).
- The database is built using Amazon DynamoDB.
- The functions are Serverless functions deployed on AWS Lambda.
- The time-triggered function is implemented using Amazon EventBridge.
- The email is sent using IFTTT.
- The DynamoDB GUI is available using dynamodb-admin.
- The patients' images are stored using Amazon S3.
- The APIs are built with Flask.
To install the prerequisites on macOS, you can follow the instructions in the file Installation.pdf.
1. Clone the repository
git clone https://github.com/BenedettoSimone/KetoCare.git
2. Launch LocalStack
docker run --rm -it -p 4566:4566 -p 4571:4571 localstack/localstack
3. Create the SQS queues and the S3 bucket
aws sqs create-queue --queue-name Measurements --endpoint-url=http://localhost:4566
aws sqs create-queue --queue-name Warning --endpoint-url=http://localhost:4566
aws s3 mb s3://patientsimages --endpoint-url=http://localhost:4566
To check that the queues have been correctly created use the following command.
aws sqs list-queues --endpoint-url=http://localhost:4566
4. Create and populate the DynamoDB tables
- Use the Python code to create the DynamoDB tables.
cd KetoCare
python3 settings/createMeasurementsTable.py
python3 settings/createAveragesTable.py
python3 settings/createPatientsTable.py
- Check that the tables have been correctly created.
aws dynamodb list-tables --endpoint-url=http://localhost:4566
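The createMeasurementsTable.py script presumably issues a boto3 create_table call along these lines; the key schema shown here (fiscal_code as partition key, measure_date as sort key) is an assumption, not taken from the repository:

```python
def measurements_table_definition():
    # Key schema is an assumption: fiscal_code as partition key,
    # measure_date as sort key.
    return {
        "TableName": "Measurements",
        "KeySchema": [
            {"AttributeName": "fiscal_code", "KeyType": "HASH"},
            {"AttributeName": "measure_date", "KeyType": "RANGE"},
        ],
        "AttributeDefinitions": [
            {"AttributeName": "fiscal_code", "AttributeType": "S"},
            {"AttributeName": "measure_date", "AttributeType": "S"},
        ],
        "BillingMode": "PAY_PER_REQUEST",
    }


def create_table():
    # Requires LocalStack listening on port 4566.
    import boto3
    client = boto3.client("dynamodb", endpoint_url="http://localhost:4566")
    client.create_table(**measurements_table_definition())
```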
- Populate the tables with initial data. The loadData script loads the measurements and the averages of the two days preceding the current day. The loadPatients script loads the data of five patients. In addition, each patient's profile photo will be loaded into the S3 bucket.
python3 settings/loadData.py
python3 settings/loadPatients.py
- Check that the tables have been correctly populated using the AWS CLI (Press q to exit)
aws dynamodb scan --table-name Measurements --endpoint-url=http://localhost:4566
aws dynamodb scan --table-name Averages --endpoint-url=http://localhost:4566
aws dynamodb scan --table-name Patients --endpoint-url=http://localhost:4566
or using the dynamodb-admin GUI with the command
DYNAMO_ENDPOINT=http://0.0.0.0:4566 dynamodb-admin
and then going to http://localhost:8001.
5. Set up the Lambda function triggered by SQS messages that saves the measurements
- Create the role.
aws iam create-role --role-name lambdarole --assume-role-policy-document file://settings/role_policy.json --query 'Role.Arn' --endpoint-url=http://localhost:4566
- Attach the policy.
aws iam put-role-policy --role-name lambdarole --policy-name lambdapolicy --policy-document file://settings/policy.json --endpoint-url=http://localhost:4566
- Create the zip file.
zip saveMeasurements.zip settings/saveMeasurements.py
- Create the function.
aws lambda create-function --function-name saveMeasurements --zip-file fileb://saveMeasurements.zip --handler settings/saveMeasurements.lambda_handler --runtime python3.6 --role arn:aws:iam::000000000000:role/lambdarole --endpoint-url=http://localhost:4566
- Create the event source mapping between the function and the queue.
aws lambda create-event-source-mapping --function-name saveMeasurements --batch-size 5 --maximum-batching-window-in-seconds 60 --event-source-arn arn:aws:sqs:us-east-2:000000000000:Measurements --endpoint-url=http://localhost:4566
6. Set up the Lambda function triggered by SQS messages that notifies the doctor of warnings via email
- Create the IFTTT applet
- Go to https://ifttt.com/ and sign-up or log-in if you already have an account.
- On the main page, click Create to create a new applet.
- Click "If This", type "webhooks" in the search bar, and choose the Webhooks service.
- Select "Receive a web request" and write "email_warning" in the "Event Name" field. Save the event name since it is required to trigger the event. Click Create trigger.
- In the applet page click Then That, type "email" in the search bar, and select Email.
- Click Send me an email and fill the fields as follows:
- Subject: [KetoCare] Warning!
- Body: The sensor of patient <b>{{Value1}}</b> reported at <b>{{Value2}}</b> a blood pH value of <b>{{Value3}}</b>.
- Click Create action, Continue, and Finish.
- Modify the key variable within the settings/emailWarning.py function with your IFTTT applet key. The key can be found by clicking on the Webhooks icon and then on Documentation.
- Zip the Python file and create the Lambda function.
zip emailWarning.zip settings/emailWarning.py
aws lambda create-function --function-name emailWarning --zip-file fileb://emailWarning.zip --handler settings/emailWarning.lambda_handler --runtime python3.6 --role arn:aws:iam::000000000000:role/lambdarole --endpoint-url=http://localhost:4566
- Create the event source mapping between the function and the queue.
aws lambda create-event-source-mapping --function-name emailWarning --batch-size 5 --maximum-batching-window-in-seconds 60 --event-source-arn arn:aws:sqs:us-east-2:000000000000:Warning --endpoint-url=http://localhost:4566
- Test the mapping by sending a message on the "Warning" queue and checking that an email is sent.
aws sqs send-message --queue-url http://localhost:4566/000000000000/Warning --message-body '{"fiscal_code": "FRRLNZ50M24F839C","measure_date": "2023-02-27 18:57:03", "measured_value": "7.02"}' --endpoint-url=http://localhost:4566
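The emailWarning function presumably forwards the three values to the IFTTT Webhooks endpoint; a sketch using only the standard library (the actual settings/emailWarning.py may differ):

```python
import json
from urllib import request as urlrequest

key = "YOUR_APPLET_KEY"  # replace with the key from the Webhooks Documentation page
EVENT_NAME = "email_warning"


def webhook_url(event_name, applet_key):
    # IFTTT Webhooks trigger endpoint.
    return f"https://maker.ifttt.com/trigger/{event_name}/with/key/{applet_key}"


def warning_payload(body):
    # Map a "Warning" message to the three ingredients used in the email body.
    return {"value1": body["fiscal_code"],
            "value2": body["measure_date"],
            "value3": body["measured_value"]}


def lambda_handler(event, context):
    # One SQS record per warning message; requires outbound network access
    # and a valid applet key.
    for record in event["Records"]:
        data = json.dumps(warning_payload(json.loads(record["body"]))).encode()
        req = urlrequest.Request(webhook_url(EVENT_NAME, key), data=data,
                                 headers={"Content-Type": "application/json"})
        urlrequest.urlopen(req)
```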
7. Set up the time-triggered Lambda function that computes the daily averages
- Create the zip file.
zip computeAvg.zip settings/computeAvg.py
- Create the function and save the Arn (it should be arn:aws:lambda:us-east-2:000000000000:function:computeAvg).
aws lambda create-function --function-name computeAvg --zip-file fileb://computeAvg.zip --handler settings/computeAvg.lambda_handler --runtime python3.6 --role arn:aws:iam::000000000000:role/lambdarole --endpoint-url=http://localhost:4566
- Create the rule and save the Arn (it should be arn:aws:events:us-east-2:000000000000:rule/calculateAvg).
aws events put-rule --name calculateAvg --schedule-expression 'cron(0 23 * * ? *)' --endpoint-url=http://localhost:4566
- Check that the rule has been correctly created with the desired frequency.
aws events list-rules --endpoint-url=http://localhost:4566
- Add permission to the rule created.
aws lambda add-permission --function-name computeAvg --statement-id calculateAvg --action 'lambda:InvokeFunction' --principal events.amazonaws.com --source-arn arn:aws:events:us-east-2:000000000000:rule/calculateAvg --endpoint-url=http://localhost:4566
- Add the lambda function to the rule using the JSON file containing the Lambda function Arn.
aws events put-targets --rule calculateAvg --targets file://settings/targets.json --endpoint-url=http://localhost:4566
8. Launch the web application
- First, run the Flask server.
python3 app.py
- In your browser, open the KetoCare/index.html file.
- Simulate the sensors and check all the daily measurements on the dashboard.
python3 sensors.py
- Wait until the time-triggered Lambda function computes the average, or invoke it manually. Then you can see the result on the dashboard.
aws lambda invoke --function-name computeAvg out --endpoint-url=http://localhost:4566