DepthPro CoreML Models
DepthPro is a monocular depth estimation model: it predicts depth from a single image.
Model Variants
Two variants are provided: DepthProNormalizedInverseDepth-pruned10-Qlinear, which outputs normalized inverse depth, and DepthPro-pruned10-Qlinear, which outputs metric depth in meters.
Model Inputs and Outputs
DepthPro Normalized Inverse Depth Models
Inputs
pixel_values
: 1536x1536 3-channel color image.
Outputs
normalized_inverse_depth
: 1536x1536 monochrome image.
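The call for this variant can be sketched in Swift. This is a minimal sketch, not the official sample code: the feature names come from the table above, while the function name and the assumption that inputs and outputs are CVPixelBuffers are mine.

```swift
import CoreML
import CoreVideo

// Sketch: run the normalized-inverse-depth variant on an already-loaded MLModel.
// "pixel_values" and "normalized_inverse_depth" are the feature names from the
// model card; the pixel-buffer types are assumptions.
func normalizedInverseDepth(model: MLModel, pixelBuffer: CVPixelBuffer) throws -> CVPixelBuffer? {
    // pixelBuffer must be a 1536x1536 color image.
    let input = try MLDictionaryFeatureProvider(
        dictionary: ["pixel_values": MLFeatureValue(pixelBuffer: pixelBuffer)]
    )
    let output = try model.prediction(from: input)
    // 1536x1536 single-channel image of normalized inverse depth.
    return output.featureValue(for: "normalized_inverse_depth")?.imageBufferValue
}
```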
DepthPro Models
Inputs
pixel_values
: 1536x1536 3-channel color image.
original_widths
: 1x1x1x1 tensor containing the original width of the image before resizing.
Outputs
depth_meters
: 1x1x1536x1536 tensor containing depth in meters.
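The metric variant additionally takes the original image width as a 1x1x1x1 tensor. A hedged Swift sketch, assuming the same feature names as the table above and an already-loaded model:

```swift
import CoreML
import CoreVideo

// Sketch: run the metric-depth variant. Feature names are from the model card;
// the function shape and buffer types are assumptions.
func depthInMeters(model: MLModel, pixelBuffer: CVPixelBuffer, originalWidth: Float) throws -> MLMultiArray? {
    // original_widths is a 1x1x1x1 tensor holding the pre-resize image width.
    let widths = try MLMultiArray(shape: [1, 1, 1, 1], dataType: .float32)
    widths[0] = NSNumber(value: originalWidth)

    let input = try MLDictionaryFeatureProvider(dictionary: [
        "pixel_values": MLFeatureValue(pixelBuffer: pixelBuffer),
        "original_widths": MLFeatureValue(multiArray: widths)
    ])
    let output = try model.prediction(from: input)
    // depth_meters: 1x1x1536x1536 array of depth values in meters.
    return output.featureValue(for: "depth_meters")?.multiArrayValue
}
```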
Download
Install huggingface-cli
brew install huggingface-cli
To download one of the .mlpackage folders to the models directory:
huggingface-cli download \
--local-dir models --local-dir-use-symlinks False \
KeighBee/coreml-DepthPro \
--include "DepthProNormalizedInverseDepth-pruned10-Qlinear.mlpackage/*" "DepthPro-pruned10-Qlinear.mlpackage/*"
To download everything, skip the --include argument.
Integrate in Swift apps
The huggingface/coreml-examples repository contains sample Swift code for DepthProNormalizedInverseDepth-pruned10-Qlinear.mlpackage and other models. See the instructions there to build the demo app, which shows how to use the model in your own Swift apps.
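Loading a downloaded .mlpackage yourself can be sketched as follows; the path matches the download command above, but the rest is an assumption rather than the demo app's code:

```swift
import CoreML

// Sketch: compile the downloaded .mlpackage once, then load the compiled model.
// The on-disk path is an assumption based on the download step above.
let packageURL = URL(fileURLWithPath: "models/DepthPro-pruned10-Qlinear.mlpackage")
let compiledURL = try MLModel.compileModel(at: packageURL)
let model = try MLModel(contentsOf: compiledURL)
```

In a shipping app you would typically cache the compiled model URL rather than recompiling on every launch.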
Model tree for KeighBee/coreml-DepthPro
Base model: apple/DepthPro