✨ BlobStar
iOS application to automagically control the device torch/flash and capture photos.
The software was quickly drafted to monitor the evolution of some Physarum polycephalum during Mission Alpha's #EleveTonBlob experiment. This custom project was crafted by a volunteer parent to help a primary school in the French Jura, not so far away from where the first Comté cheese was crafted.
Photos are saved in the device Photo Library and are meant to be aggregated into a movie file later. See the Post-processing section for examples.
The name is a word play on "blob", the other name the French give to Physarum polycephalum.
Features
- Take a photo every 1 to 60 minutes
- Control the torch level and flash mode
- Turn the torch on, then off, during capture (see the sketch below)
- Store geo-coordinates in the photo EXIF data
- Save photos to the Photo Library
- Persistent user settings
- Prevent the device from sleeping while the app is active
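For illustration, here is a minimal Swift sketch of how the torch could be driven around a capture using AVFoundation. The function and parameter names are hypothetical; this is not the app's actual code:

```swift
import AVFoundation

// Hypothetical helper, not the app's actual code: switch the torch on,
// trigger a capture, then switch the torch off again.
func captureWithTorch(device: AVCaptureDevice,
                      output: AVCapturePhotoOutput,
                      delegate: AVCapturePhotoCaptureDelegate,
                      torchLevel: Float) {
    // Turn the torch on at the requested level (greater than 0.0, up to 1.0).
    if device.hasTorch, (try? device.lockForConfiguration()) != nil {
        try? device.setTorchModeOn(level: torchLevel)
        device.unlockForConfiguration()
    }

    // Request the capture; the delegate receives the resulting photo.
    output.capturePhoto(with: AVCapturePhotoSettings(), delegate: delegate)

    // Turn the torch back off. In a real app this would more likely happen
    // in the delegate callback, once the photo has actually been captured.
    if device.hasTorch, (try? device.lockForConfiguration()) != nil {
        device.torchMode = .off
        device.unlockForConfiguration()
    }
}
```

Preventing sleep while the app is active boils down to setting `UIApplication.shared.isIdleTimerDisabled = true`.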
Preview
A picture is worth a thousand words, so here is a Simulator screenshot:
🔒 Privacy
The app needs to access some privacy-sensitive features, such as:
| Privacy | Permission | Required | Usage |
| --- | --- | --- | --- |
| Camera | Access the Camera | | Preview and capture photos |
| Location | Allow While Using App | | Store device location in photo EXIF data |
| Photo Library | Allow Access to All Photos | | Store photos in the device library |
Location and Camera permissions are requested at the app's first launch.
Photo Library permission is requested during the first photo capture, i.e. the first time you press the Camera button.
The app raises a `fatalError()` when it can't access the camera (poor design due to rushed deadlines); the Privacy options can later be fixed in the BlobStar preferences panel of the Settings app.
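As a rough sketch (not the app's actual implementation), such checks could be written with the standard AVFoundation and Photos authorization APIs; the helper names below are made up for illustration:

```swift
import AVFoundation
import Photos

// Hypothetical helper: request camera access up front and stop abruptly,
// like the app does, when the permission is denied.
func ensureCameraAccess(then completion: @escaping () -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion()
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { granted in
            guard granted else { fatalError("Camera access is required") }
            completion()
        }
    default:
        // Denied or restricted: mirrors the app's blunt fatalError() behavior.
        fatalError("Camera access is required")
    }
}

// Hypothetical helper: request full Photo Library access ("Allow Access to
// All Photos") lazily, right before the first photo is saved.
func ensurePhotoLibraryAccess(then completion: @escaping (Bool) -> Void) {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        completion(status == .authorized)
    }
}
```

Location would go through `CLLocationManager`'s `requestWhenInUseAuthorization()` in the same spirit.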
Build
The app was developed with Xcode version 13 and tested on recent devices running iOS version 15.
It is trivial to build and run the app on any device, as long as you deal with the Code Signing requirements. In other words, anyone with a Mac who is registered as a free Apple Developer can install Xcode to build and install iOS apps, with just a few limitations such as the provisioning profile expiration period.
The project source code is provided as is; please do not contact me for support (this file covers pretty much everything you need) or for feature requests beyond the scope of the mission. Responsible pull requests and forks are welcome.
Post-processing
On macOS, you can export the photos from the device using either the Photos or the Image Capture app, then use the Open Image Sequence command in the QuickTime Player app to create the video file.
Advanced users may prefer running:

- the scriptable image processing system command-line interface `sips` to convert HEIC images to PNG or TIFF
- FFmpeg to create H.264 or ProRes sequential movie files
Example:
```sh
# Go to photo directory
cd /path/to/photo/dir

# Convert HEIC files to PNG
find . -name "*.HEIC" | sort | while read -r filename
do
    sips -s format png "${filename}" -o "${filename%.*}.png"
done

# Create a 12 frames per second H.264 video file
ffmpeg -framerate 12 -pattern_type glob -i "*.png" -c:v libx264 -pix_fmt yuv420p output.mp4

# Create a 12 frames per second ProRes video file
ffmpeg -framerate 12 -pattern_type glob -i "*.png" -c:v prores -profile:v 3 -pix_fmt yuv422p10 output.mov
```
And here is the `ffmpeg` option to transpose images by 90 degrees clockwise, if needed: `-vf "transpose=1"`.
Use the Homebrew package manager to install FFmpeg and its dependencies.
Acknowledgments
Many thanks to the wonderful pilots, doctors, engineers, etc. who turn the Space discovery dream into a reality. I am so glad you are sharing your research with all the kids down here.
Also, I am using the mission logo for the app icon without permission, sorry.