Nutrition Box is an iOS app that helps people identify the food around them from photos using machine learning.
Users can choose the source of their photo: pick one from the photo library or take a picture directly with the camera. The selected photo is then processed by a machine-learning model that identifies the food it shows. Once the food is identified, the user can view its details, such as the total weight (per unit), the number of calories, and its nutritional content. Users can save the identification results, which are stored in a report table.
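The nutrition details above (total weight, calories, nutrients) can be represented as a small `Codable` model. Below is a minimal sketch; the field names (`calories`, `totalWeight`, `totalNutrients`) are assumptions modeled on the Edamam Nutrition Analysis response and may not match the app's actual types:

```swift
import Foundation

// Hypothetical model of a nutrition-analysis response; field names are
// assumptions modeled on the Edamam Nutrition Analysis API and may differ.
struct Nutrient: Codable {
    let label: String     // e.g. "Protein"
    let quantity: Double
    let unit: String      // e.g. "g"
}

struct NutritionResult: Codable {
    let calories: Double
    let totalWeight: Double
    let totalNutrients: [String: Nutrient]
}

// Decode a sample JSON payload into the model.
let sample = """
{
  "calories": 52,
  "totalWeight": 100,
  "totalNutrients": {
    "PROCNT": { "label": "Protein", "quantity": 0.3, "unit": "g" }
  }
}
""".data(using: .utf8)!

let result = try JSONDecoder().decode(NutritionResult.self, from: sample)
print(result.calories)                         // 52.0
print(result.totalNutrients["PROCNT"]!.label)  // Protein
```

A struct like this could back both the detail screen and the saved report entries.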
| View | View |
|---|---|
| Coming Soon | Coming Soon |
- UIKit
- CoreML / Vision
- Native iOS Networking
- CoreData
- Inception V3 (https://developer.apple.com/machine-learning/models/) - CoreML model used to identify the object in the photo
- Edamam API (https://developer.edamam.com/edamam-docs-nutrition-api) - Nutrition Analysis API used to analyze the nutrition of a given food (passed as a parameter)
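With native iOS networking, a request to a nutrition-analysis endpoint can be assembled with `URLComponents`. A minimal sketch, assuming the endpoint path and the `app_id`/`app_key`/`ingr` parameter names from Edamam's public docs; the credentials are placeholders:

```swift
import Foundation

// Build the query URL for a nutrition-analysis request.
// Endpoint and parameter names are assumptions based on the Edamam
// Nutrition Analysis docs; YOUR_APP_ID / YOUR_APP_KEY are placeholders.
func nutritionRequestURL(for ingredient: String) -> URL? {
    var components = URLComponents(string: "https://api.edamam.com/api/nutrition-data")
    components?.queryItems = [
        URLQueryItem(name: "app_id", value: "YOUR_APP_ID"),
        URLQueryItem(name: "app_key", value: "YOUR_APP_KEY"),
        URLQueryItem(name: "ingr", value: ingredient)  // e.g. "1 large apple"
    ]
    return components?.url
}

let url = nutritionRequestURL(for: "1 large apple")
print(url!.absoluteString)
```

In the app, a URL like this would be handed to a `URLSession` data task and the response decoded with `JSONDecoder`.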
- Built for iOS 14.5 and above
- Built with Xcode 12
- Written in Swift 5
This project is my personal work, created as part of the Udacity Nanodegree Program.