iOS App File Layout
-- First layer
SemanticSegmentation-CoreML/
-- Second layer
Main.storyboard
AppDelegate.swift
LiveImageViewController.swift
LiveMetalCameraViewController.swift
StillImageViewController.swift
DrawingSegmentationView.swift
LiveFaceDetectionAndFaceParsingViewController.swift
SegmentationResultMLMultiArray.swift
Measure.swift
VideoCapture.swift
MetalCamera/
-- Third layer
MetalRenderingDevice.swift
MetalVideoView.swift
CameraTextureGenerater.swift
SegmentationTextureGenerater.swift
MultitargetSegmentationTextureGenerater.swift
OverlayingTexturesGenerater.swift
CroppedTextureGenerater.swift
MaskTextureGenerater.swift
Texture.swift
Shaders/
-- Fourth layer
Shaders.metal
Utils/
-- Fourth layer
Maths.swift
mlmodel/
-- Fourth layer
DeepLabV3.mlmodel
DeepLabV3FP16.mlmodel
DeepLabV3Int8LUT.mlmodel
FaceParsing.mlmodel
Core ML Models
Segmentation Data:
The segmentation shaders read buffer data that encodes per-pixel classification results (like SegmentationValue) and use those class indices to decide which color to output for each pixel, as in the sketch below.
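A minimal sketch of such a fragment shader, assuming the class map arrives as a buffer of integer class indices plus its dimensions; the function name, struct, and buffer layout here are illustrative assumptions, not the project's actual Shaders.metal code:

#include <metal_stdlib>
using namespace metal;

// Illustrative vertex-to-fragment struct; the real one lives in Shaders.metal.
struct FragmentIn {
    float4 position [[position]];
    float2 texCoord;
};

// Hypothetical segmentation fragment shader: look up the predicted class
// index for the current pixel and map it to a color conditionally.
fragment float4 segmentation_color_sketch(FragmentIn in [[stage_in]],
                                          constant int *classMap [[buffer(0)]],
                                          constant uint2 &mapSize [[buffer(1)]])
{
    // Convert the interpolated texture coordinate into a class-map index.
    uint x = min(uint(in.texCoord.x * float(mapSize.x)), mapSize.x - 1);
    uint y = min(uint(in.texCoord.y * float(mapSize.y)), mapSize.y - 1);
    int classIndex = classMap[y * mapSize.x + x];

    // Conditional coloring: background stays transparent, any other
    // class (e.g. "person" in DeepLabV3) gets a semi-transparent tint.
    if (classIndex == 0) {
        return float4(0.0, 0.0, 0.0, 0.0);
    }
    return float4(1.0, 0.0, 0.0, 0.5);
}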
Layered Rendering:
The presence of distinct shaders (like segmentation_render_target and multitarget_segmentation_render_target) suggests that the application performs layered rendering, where each layer represents different data or visual aspects. These layers can be composited into the final image, with each layer potentially drawn by a specialized shader suited to its purpose; a compositing sketch follows below.
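As an illustration of such compositing, a fragment shader along the following lines could blend a camera layer with a segmentation overlay; the function name and texture bindings are assumptions, not the shaders actually used by OverlayingTexturesGenerater:

#include <metal_stdlib>
using namespace metal;

// Illustrative vertex-to-fragment struct, as in the previous sketch.
struct FragmentIn {
    float4 position [[position]];
    float2 texCoord;
};

// Hypothetical compositing shader: sample the camera layer and the
// segmentation overlay, then blend them with a "source over" operation.
fragment float4 overlay_compositing_sketch(FragmentIn in [[stage_in]],
                                           texture2d<float> cameraTexture [[texture(0)]],
                                           texture2d<float> overlayTexture [[texture(1)]])
{
    constexpr sampler s(mag_filter::linear, min_filter::linear);
    float4 base = cameraTexture.sample(s, in.texCoord);
    float4 overlay = overlayTexture.sample(s, in.texCoord);

    // The overlay layer sits on top of the camera layer, weighted by its alpha.
    float3 color = overlay.rgb * overlay.a + base.rgb * (1.0 - overlay.a);
    return float4(color, 1.0);
}

Each additional layer could be blended the same way, which fits the one-generater-per-concern split seen in the MetalCamera folder (camera, segmentation, mask, and overlay textures).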
Reference
likedan/Awesome-CoreML-Models (the largest list of models for Core ML, iOS 11+): https://github.com/likedan/Awesome-CoreML-Models
juanmorillios/List-CoreML-Models: https://github.com/juanmorillios/List-CoreML-Models
satoshi0212/MSLExamples (Metal Shading Language examples): https://github.com/satoshi0212/MSLExamples