Project: Lip Colour Finder

Overview

Lip Colour Finder accepts an uploaded image of a face and returns lipstick recommendations based on the colour of the lips found on that face.

Processing

Once an image is uploaded to the server, pre-trained models locate faces and their facial features, which are then used to home in on the areas of interest on the face.
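As a minimal sketch of that step, assuming dlib's 68-point landmark scheme (where points 48–67 cover the mouth), a lip crop can be cut from the image once the landmarks are known; the `pad` value is an assumption:

```python
import numpy as np

# Assumption: landmarks follow dlib's 68-point scheme, where indices
# 48-67 outline the mouth. A padded bounding box around them gives a lip crop.
LIP_POINTS = slice(48, 68)

def lip_crop(image, landmarks, pad=4):
    """Crop the lip region from `image` given a (68, 2) integer array of (x, y) landmarks."""
    pts = landmarks[LIP_POINTS]
    x0, y0 = pts.min(axis=0) - pad
    x1, y1 = pts.max(axis=0) + pad
    h, w = image.shape[:2]
    # Clamp the box to the image bounds before slicing.
    x0, y0 = max(x0, 0), max(y0, 0)
    x1, y1 = min(x1, w), min(y1, h)
    return image[y0:y1, x0:x1]
```

In the real pipeline the landmark array would come from a dlib shape predictor run on the detected face.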

After the location of the face's skin has been detected using the method defined here, the face's skin colour is estimated from a histogram of the skin region's pixels.
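One simple way to turn a histogram into a single colour estimate is to take the most populated bin per channel; the bin count here is an assumption, not the project's actual setting:

```python
import numpy as np

def dominant_colour(pixels, bins=32):
    """Estimate a dominant colour from an (N, 3) array of 8-bit RGB pixels
    by taking the centre of the most populated histogram bin in each channel."""
    pixels = np.asarray(pixels)
    estimate = []
    for c in range(3):
        hist, edges = np.histogram(pixels[:, c], bins=bins, range=(0, 256))
        peak = np.argmax(hist)
        estimate.append((edges[peak] + edges[peak + 1]) / 2)  # bin centre
    return np.array(estimate)
```

A per-channel mode is robust to small patches of shadow or highlight that would skew a plain mean.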

The skin colour estimate is then subtracted from a crop of the image containing only the face's lips, leaving a group of pixels from which a second histogram yields an estimate of the lip colour.
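One plausible reading of that subtraction step, sketched below as an assumption rather than the project's exact method, is to discard pixels of the lip crop that sit close to the estimated skin colour; the distance threshold is a made-up value:

```python
import numpy as np

def lip_pixels(crop, skin_rgb, threshold=40.0):
    """Return the (N, 3) pixels of a lip crop whose Euclidean RGB distance
    from the estimated skin colour exceeds `threshold`, i.e. the pixels
    left over once skin-coloured pixels are removed."""
    flat = crop.reshape(-1, 3).astype(float)
    dist = np.linalg.norm(flat - np.asarray(skin_rgb, dtype=float), axis=1)
    return flat[dist > threshold]
```

The surviving pixels can then be fed to the same histogram routine used for the skin estimate.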

The estimated lip colour is converted to the L*a*b* colour space and is compared against a database of lipstick colours that have also been converted to the L*a*b* colour space. Lipsticks with the smallest colour distance from our estimate are returned to the user.
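The matching step might look like the sketch below, using scikit-image for the L*a*b* conversion and the CIE76 colour distance; the shade names and RGB values are hypothetical, and the real service compares against a database rather than an in-memory dict:

```python
import numpy as np
from skimage.color import rgb2lab, deltaE_cie76

def best_matches(lip_rgb, shades, k=3):
    """Rank lipstick shades by CIE76 distance in L*a*b* from the estimated
    lip colour. `lip_rgb` and shade values are RGB triples in [0, 1];
    `shades` maps shade name -> RGB. Returns the k closest shade names."""
    lip_lab = rgb2lab(np.asarray(lip_rgb, dtype=float).reshape(1, 1, 3))[0, 0]
    scored = []
    for name, rgb in shades.items():
        lab = rgb2lab(np.asarray(rgb, dtype=float).reshape(1, 1, 3))[0, 0]
        scored.append((deltaE_cie76(lip_lab, lab), name))
    scored.sort(key=lambda t: t[0])
    return [name for _, name in scored[:k]]
```

Distances are computed in L*a*b* rather than RGB because Euclidean distance there tracks perceived colour difference far more closely.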

Internals

Lip Colour Finder is built on the following technologies:

Web backend: Python | Django

Image processing: Python | dlib | scikit-image | NumPy

Web frontend: JavaScript | React

Web server: Ubuntu | NGINX | Gunicorn

Privacy

Images uploaded to the service are not viewed by a human and are deleted as soon as the analysis is complete.


Photo in UI Screenshots by Mathias Huysmans on Unsplash
