
Copyright 2019 Rachel Pollock. All rights reserved.


mobile rock-collecting companion for the novice geologist

fall 2018 · 2 weeks · UI / Motion Design

how might I leverage the affordances of a traditional field guide and the affordances of a mobile phone to help someone classify a rock?

This 2-week exercise in UI design asked me to translate a traditionally analog tool into a mobile application. I chose to design a field guide for classifying rock specimens.

Project Process

  • research & synthesis
  • analysis of existing tools
  • task analysis
  • task flow
  • visual design
  • icon design
  • hi-fi wireframes
  • clickable mockup
  • motion design


task analysis

Ah yes, the task analysis. I'm getting excited just thinking about it - this is one of my favorite parts of any UI project.


Anyway, the user's primary task is to properly label each specimen they collect. But how does one go about classifying a rock? And how could a computer do the same thing? Would the process be different? I dug deep into the world of rock classification (which, surprisingly, I found genuinely interesting) and determined that an AI-powered, image-based rock-classifying tool seemed feasible. Though the app would do the real classification work, I also wanted it to educate its users, in an easily digestible way, about how it was doing that classification. I made sure to work this idea into the app's navigational flow.


The user's primary task (and the one I would develop) is highlighted in black.

device affordance considerations

The phone screen's modest size forced me to prioritize the information I presented on each screen. I decided that each screen would focus on a single task until the specimen is classified.

human factors & context-of-use considerations

One-handed operation proved imperative to the app's usability, as users may be handling rocks or other materials while using their phones. Therefore, I designed the app to operate with a few simple, single-handed swipes, taps, and flicks.

capture those specimens!

introducing Collector

From the field to the lab, Collector is your new favorite mobile rock-collecting companion! Collector allows you to capture specimens and then scientifically classify them. Never studied geology before? No problem! Collector's step-by-step classification sequence will have you feeling like a geologist in no time.


hi-fi wireframing

The following wireframes represent key actions in the task flow for capturing and saving a specimen.



Users use the phone's camera to capture a photo of the specimen



The app's algorithm analyzes and classifies the photo by comparing it to others in its database



Once the specimen is classified, its profile is added to the user's collection. The profile details the specimen's location and physical properties.


UI motion design considerations

This stage of the project taught me a lot about how a UI's motion design helps users anticipate the app's hierarchy and functionality (including the gestures needed to complete their task). While I wireframed, I considered each UI element's contribution to the user's understanding of the app's navigation and functionality.

hi-fi UI wireflow

I mapped out each UI element's motion-design triggers and feedback within the wireframes for the specimen classification task flow.

hi-fi clickable prototype

I created the following clickable prototype in Principle, following the actions I'd mapped in the wireflow.

reflections & moving forward

With more time, I would love to test the clickable prototype with some real users and continue iterating.



If interested, users can read about each of the specimen's properties in greater detail.