Gestural Video Editing

in Facebook Camera


For years, people primarily shared text and photo posts, and that's where Facebook invested its efforts. However, Mark believes video is the format of the future. After joining the Sharing team, I was tasked with improving the video editing and sharing experience.


My Role

I was the sole designer of a team with a product manager and several engineers.

I designed the entire experience, identified entry points, coordinated user research, iterated based on feedback, and created implementation specs for both iOS and Android.


The Problem

One of the features I redesigned was the video trimmer. The feature has high impact in both mature and emerging markets.

People in mature markets edit their videos to tell a concise story, whereas people in emerging markets edit their videos to save on upload data.


After testing other video trimming tools, I found that:

  1. The small thumbnail area is difficult to target with a finger and sits too close to the bottom edge of the screen

  2. Thumbnails should help people determine where to trim their video, but the frames are too small to tell apart



I explored a variety of designs. After sharing them with the team, we decided to focus on the design that uses full-screen gestural interaction.




User Experience

Gestural interaction and direct manipulation on the video itself within Camera are quickly becoming the norm with AR and doodling. As a result, I explored a design where people interact directly with the video.


Swiping rightward on the left side of the screen trims the video from the beginning, while swiping leftward on the right side trims it from the end.

To achieve this, an imaginary line runs down the middle of the screen, and when the user swipes, the UI detects which half the swipe started in. The center line shifts dynamically as the video is trimmed, staying at the midpoint of the remaining clip.
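The logic can be sketched roughly as follows. This is an illustrative reconstruction, not the shipped code; all names (`TrimState`, `applySwipe`) and the pixel-to-seconds mapping are assumptions for the sketch.

```typescript
interface TrimState {
  start: number; // trim-in point, seconds
  end: number;   // trim-out point, seconds
}

// Map a horizontal swipe to a trim adjustment. The imaginary center line
// sits at the midpoint of the currently kept region, so it shifts as the
// video is trimmed.
function applySwipe(
  state: TrimState,
  duration: number,     // full video length, seconds
  screenWidth: number,  // pixels
  touchX: number,       // where the swipe began, pixels
  deltaX: number        // horizontal drag distance, pixels (+ is rightward)
): TrimState {
  const deltaSeconds = deltaX * (duration / screenWidth);

  // Center line: midpoint of the kept region, projected onto the screen.
  const centerX = ((state.start + state.end) / 2 / duration) * screenWidth;

  if (touchX < centerX) {
    // Left half: swiping right moves the trim-in point later.
    const start = Math.min(Math.max(0, state.start + deltaSeconds), state.end);
    return { start, end: state.end };
  } else {
    // Right half: swiping left moves the trim-out point earlier.
    const end = Math.max(Math.min(duration, state.end + deltaSeconds), state.start);
    return { start: state.start, end };
  }
}
```

For example, on a 60-second clip shown across a 600-pixel screen, a 100-pixel rightward swipe starting on the left half moves the trim-in point 10 seconds later.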

I spent a lot of time prototyping to get this right. Origami was my tool of choice, and I was able to prototype the exact behavior. I loaded the prototype onto my phone and did some quick guerrilla research with the team and desk neighbors.


Since this is a novel interaction, I designed a NUX (new user experience) to teach it.

Additionally, as a fallback, people can still use the timeline to trim if they miss the gesture.