
feat: add Motion Photo properties to the photo facet #36

Open
dschmidt wants to merge 2 commits into main from feat/motion-photo

Conversation


@dschmidt dschmidt commented Apr 11, 2026

Description

What are Motion Photos?

Motion Photos are image files (JPEG, HEIC, or AVIF) with a short video clip (typically 1.5-3 seconds) appended to the end of the file. When you take a photo on a Google Pixel or Samsung Galaxy device, by default the camera captures a brief video around the moment of the shutter press and embeds it directly into the image file. The result is a single file that works as a normal photo everywhere, but can also play back the embedded video in apps that support it - similar to Apple's Live Photos, but as a single file rather than a separate image/video pair.

The format is specified by Google as the Motion Photo format v1.0. It uses XMP metadata in the Camera namespace (http://ns.google.com/photos/1.0/camera/) to signal that a file is a Motion Photo and to describe how the still image and video relate to each other.

Samsung devices have converged on the same XMP metadata format, making this the de facto standard across the majority of Android devices.
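For reference, the relevant portion of such an XMP packet looks roughly like this (a minimal, illustrative sketch with made-up values; the full Motion Photo v1.0 packet also carries a Container directory describing the primary image and the embedded video item):

```xml
<x:xmpmeta xmlns:x="adobe:ns:meta/">
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
    <rdf:Description rdf:about=""
        xmlns:Camera="http://ns.google.com/photos/1.0/camera/"
        Camera:MotionPhoto="1"
        Camera:MotionPhotoVersion="1"
        Camera:MotionPhotoPresentationTimestampUs="1245580"/>
  </rdf:RDF>
</x:xmpmeta>
```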

What this PR adds

Three new read-only properties on the photo schema, following the @libre.graph.* extension pattern for properties not present in the MS Graph API:

| Property | Type | Description |
| --- | --- | --- |
| `@libre.graph.motionPhoto` | integer (int32) | `1` if the file is a Motion Photo, `0` or absent if not. Maps to `Camera:MotionPhoto` in the spec. |
| `@libre.graph.motionPhotoVersion` | integer (int32) | Format version of the Motion Photo spec (currently always `1`). Maps to `Camera:MotionPhotoVersion`. |
| `@libre.graph.motionPhotoPresentationTimestampUs` | integer (int64) | Presentation timestamp (in microseconds) of the video frame corresponding to the still image. `-1` means unspecified. Maps to `Camera:MotionPhotoPresentationTimestampUs`. |

Example response

```json
{
  "photo": {
    "cameraMake": "Google",
    "cameraModel": "Pixel 8 Pro",
    "iso": 58,
    "takenDateTime": "2024-03-15T14:22:07Z",
    "@libre.graph.motionPhoto": 1,
    "@libre.graph.motionPhotoVersion": 1,
    "@libre.graph.motionPhotoPresentationTimestampUs": 1245580
  }
}
```

Implementation plan

The overall goal is to let the OpenCloud web frontend detect Motion Photos and play back the embedded video when a user opens one.

Backend (opencloud repo)

Search service / Tika extractor (services/search/pkg/content/tika.go):

  • Tika already parses XMP metadata from JPEGs. The Camera:MotionPhoto, Camera:MotionPhotoVersion, and Camera:MotionPhotoPresentationTimestampUs fields should be available in the Tika metadata map.
  • Extend the existing getPhoto() function to also read these fields and set them on the Photo struct.
  • Also handle the legacy GCamera:MicroVideo format (Pixel 2/3 era) and Samsung's EmbeddedVideoType: MotionPhoto_Data by normalizing them to motionPhoto = 1.
  • marshalToStringMap needs to strip the @libre.graph. prefix from JSON tag names before building the storage key, so that @libre.graph.motionPhoto is stored as libre.graph.photo.motionPhoto, not libre.graph.photo.@libre.graph.motionPhoto.
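The key normalization described in the last bullet can be sketched as follows. The actual mappers live in Go; this TypeScript sketch (with hypothetical names `storageKey` and `LIBRE_GRAPH_PREFIX`) only illustrates the transformation rule, not the real implementation:

```typescript
const LIBRE_GRAPH_PREFIX = "@libre.graph.";

// Build the storage key for a facet property: strip the "@libre.graph."
// extension prefix from the JSON tag before composing the key, so that
// "@libre.graph.motionPhoto" is stored as "libre.graph.photo.motionPhoto".
function storageKey(facet: string, jsonTag: string): string {
  const name = jsonTag.startsWith(LIBRE_GRAPH_PREFIX)
    ? jsonTag.slice(LIBRE_GRAPH_PREFIX.length)
    : jsonTag;
  return `libre.graph.${facet}.${name}`;
}
```

`unmarshalStringMap` in the graph service would apply the same rule in reverse when looking up keys.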

Graph service (services/graph/pkg/service/v0/driveitems.go):

  • unmarshalStringMap needs the same prefix-stripping when looking up keys in the metadata map. Once that is in place and the libre-graph-api-go model is regenerated, the new fields flow through automatically via the existing reflection-based marshalling.

Reva (internal/http/services/owncloud/ocdav/propfind/propfind.go):

  • Add motionPhoto, motionPhotoVersion, and motionPhotoPresentationTimestampUs to the photoKeys slice.
  • appendMetadataProp already builds storage keys from the key list, so no further changes needed there beyond the new entries.

Frontend (web repo)

Preview app (packages/web-app-preview/):

  • When the preview app opens an image, check if photo['@libre.graph.motionPhoto'] === 1.
  • If so, show a visual indicator (e.g. a "Motion" badge or play icon).
  • On user interaction (hover, long-press, or play button), fetch the original file (not the thumbnail), extract the embedded MP4 by scanning for the ftyp signature in the bytes, and play it in a <video> element.
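The `ftyp` scan in the last step could look roughly like this (a sketch; the function name is hypothetical, and a naive forward scan can in principle hit a false positive inside the JPEG payload, so a robust client might scan backwards from the end of the file or prefer a known video size when available):

```typescript
// Locate the embedded MP4 inside a Motion Photo's bytes. An MP4 begins with
// a 4-byte box size followed by the ASCII tag "ftyp", so the video starts
// 4 bytes before the first "ftyp" occurrence. Returns -1 if not found.
function findEmbeddedVideoOffset(bytes: Uint8Array): number {
  for (let i = 4; i + 4 <= bytes.length; i++) {
    if (
      bytes[i] === 0x66 && // 'f'
      bytes[i + 1] === 0x74 && // 't'
      bytes[i + 2] === 0x79 && // 'y'
      bytes[i + 3] === 0x70 // 'p'
    ) {
      return i - 4;
    }
  }
  return -1;
}
```

The resulting byte slice can be wrapped in a `Blob` with type `video/mp4` and handed to a `<video>` element via `URL.createObjectURL`.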

Out of scope for now

  • Apple Live Photos: These consist of two separate files (HEIC + MOV) linked by a ContentIdentifier. Supporting them requires a fundamentally different approach (file relationship management) and is tracked separately.
  • Video metadata extraction: The embedded video's dimensions, duration, and codec are not available from the XMP metadata. Extracting them would require parsing the MP4 container separately. This information is not essential - the frontend learns it when it plays the video.
  • Backend video extraction endpoint: A dedicated endpoint to stream just the video portion of a Motion Photo would avoid loading the full file on the client. This is a potential future optimization.


I'm willing to do the implementation.

Added properties for Motion Photo metadata including version and presentation timestamp.

Copilot AI left a comment


Pull request overview

This PR extends the photo facet schema in the OpenAPI spec to describe Motion Photo metadata using the existing @libre.graph.* extension pattern.

Changes:

  • Add @libre.graph.motionPhoto (int32) to indicate whether a file is a Motion Photo.
  • Add @libre.graph.motionPhotoVersion (int32) and @libre.graph.motionPhotoPresentationTimestampUs (int64) as read-only Motion Photo metadata fields.
  • Document semantics and mapping to Google’s Motion Photo XMP fields in the schema descriptions.



dschmidt commented Apr 12, 2026

I'm not sure about the prefix - how strict are we about adding those `@libre.graph` prefixes?
It will be quite some effort to handle them in all the mappers for very little practical use... and it makes accessing the value awkward in TypeScript:
`resource.photo?.['@libre.graph.motionPhoto']` instead of just `resource.photo?.motionPhoto`

But if you insist, it's doable of course.

Added motionPhotoVideoSize property to the OpenAPI spec.

dschmidt commented Apr 12, 2026

I've added `@libre.graph.motionPhotoVideoSize` so clients can retrieve just the video, without downloading the full picture, using a range request - since the video is guaranteed to sit at the end of the file, the video data starts at `item.size - item.photo.motionPhotoVideoSize`.
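On the client that range computation could be sketched like this (hypothetical helper names; assumes the server honors HTTP `Range` requests and answers with `206 Partial Content`):

```typescript
// The embedded video occupies the last `videoSize` bytes of the file.
function videoByteRange(
  fileSize: number,
  videoSize: number
): { start: number; end: number } {
  return { start: fileSize - videoSize, end: fileSize - 1 };
}

// Fetch only the embedded video portion of a Motion Photo via a range request.
async function fetchMotionVideo(
  url: string,
  fileSize: number,
  videoSize: number
): Promise<Blob> {
  const { start, end } = videoByteRange(fileSize, videoSize);
  const res = await fetch(url, {
    headers: { Range: `bytes=${start}-${end}` },
  });
  if (res.status !== 206) {
    throw new Error("server ignored the Range request");
  }
  return res.blob();
}
```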
