Android Tips: Faces Detection with Mobile Vision API - Google Play services

    When we use a social network app like Facebook or Google+, it always detects and recognizes human faces in each uploaded image. Google provides Android developers with a large proprietary background service and API package called Google Play Services, which lets us integrate Google's services (YouTube, Maps, Google+, ...) into Android apps.
    With the release of Google Play services 7.8, Google announced the addition of new Mobile Vision APIs, which include a new Face API that finds human faces in images and video better and faster than before. This API is also smarter at distinguishing faces at different orientations and with different facial features and facial expressions.

Face Detection API

    As its docs say, we must pay attention to this note:
This is not a face recognition API. Instead, the new API simply detects areas in the image or video that are human faces. It also infers from changes in position from frame to frame that faces in consecutive frames of video are the same face. If a face leaves the field of view and re-enters, it isn’t recognized as a previously detected face.
    Face Detection is a leap forward from the previous Android FaceDetector.Face API. It’s designed to better detect human faces in images and video for easier editing. It’s smart enough to detect faces even at different orientations -- so if your subject’s head is turned sideways, it can detect it. Specific landmarks can also be detected on faces, such as the eyes, the nose, and the edges of the lips.

Starting project

    After creating a new project, in order to use the Google Play Services API, you must add a dependency to your Gradle build file. Your app/build.gradle dependencies entry will look like this:
dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
    compile 'com.android.support:appcompat-v7:22.2.1'
    compile 'com.google.android.gms:play-services:7.8.0' 
}
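Pulling in the entire play-services artifact adds considerable size to your APK. In later Play services releases, Google also published granular artifacts; if one is available for your version, depending on only the Vision split keeps the build smaller (the exact version that first shipped this split is our assumption, so check the Play services release notes):

```
dependencies {
    compile 'com.google.android.gms:play-services-vision:8.3.0'
}
```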
   Create a simple layout for the activity; it only includes two buttons ("Choose Image From Gallery" and "Detect Faces") and an ImageView:
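The layout XML itself was not included in this extract; a minimal sketch that matches the view IDs the activity code expects (btn_choose, btn_detect_face, image) could look like this:

```xml
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <Button
        android:id="@+id/btn_choose"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Choose Image From Gallery" />

    <Button
        android:id="@+id/btn_detect_face"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="Detect Faces" />

    <ImageView
        android:id="@+id/image"
        android:layout_width="match_parent"
        android:layout_height="0dp"
        android:layout_weight="1" />

</LinearLayout>
```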
The screen looks like this after running the app:

Programmatically Code for Activity

    We select an image after clicking the "Choose Image From Gallery" button. To filter files so that only image formats can be chosen, use this Intent method:
intent.setType("image/*"); // filter only image type files
    Get the Bitmap and set it on the ImageView in the activity's onActivityResult() method:
private View.OnClickListener onLoadImageListener() {
        return new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                Intent intent = new Intent();
                intent.setType("image/*"); // filter only image type files
                intent.setAction(Intent.ACTION_GET_CONTENT);
                intent.addCategory(Intent.CATEGORY_OPENABLE);
                startActivityForResult(intent, REQUET_LOADIMAGE);
            }
        };
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUET_LOADIMAGE && resultCode == RESULT_OK) {

            if (bitmap != null) {
                bitmap.recycle();
            }
            try {
                InputStream inputStream = getContentResolver().openInputStream(data.getData());
                bitmap = BitmapFactory.decodeStream(inputStream);
                inputStream.close();
                image.setImageBitmap(bitmap);

            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
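One practical caveat with decodeStream(): a full-resolution camera photo can exhaust memory on low-end devices. A common remedy is to decode the image bounds first and then decode again with a power-of-two BitmapFactory.Options.inSampleSize. The size calculation itself is plain Java; this helper is a minimal sketch (the class and method names are our own, not part of the tutorial):

```java
// Minimal sketch of the standard inSampleSize calculation used with
// BitmapFactory.Options: pick the largest power of two that keeps the
// decoded bitmap at or above the requested dimensions.
public class BitmapSampling {

    public static int calculateInSampleSize(int width, int height,
                                            int reqWidth, int reqHeight) {
        int inSampleSize = 1;
        // Keep halving the dimensions while they still cover the requested size.
        while ((height / (inSampleSize * 2)) >= reqHeight
                && (width / (inSampleSize * 2)) >= reqWidth) {
            inSampleSize *= 2;
        }
        return inSampleSize;
    }
}
```

You would do a first decode with options.inJustDecodeBounds = true to obtain the real width and height, then decode the stream again with options.inSampleSize set to the value returned above.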
    When the "Detect Faces" button is clicked, the app detects all faces in the current image. Initialize a FaceDetector object and run it; this process yields a SparseArray of Face objects:
 //Detect the Faces
        FaceDetector faceDetector = new FaceDetector.Builder(getApplicationContext())
                .setTrackingEnabled(false)
                .build();

        Frame frame = new Frame.Builder().setBitmap(bitmap).build();
        SparseArray<Face> faces = faceDetector.detect(frame);
    Once faces are detected, draw a red rectangle around every face, producing a new Bitmap which is then set on the ImageView:
if (faces.size() == 0) {
            Toast.makeText(this, "No face detected!", Toast.LENGTH_SHORT).show();
        } else {
            //Draw Rectangles on the Faces
            for (int i = 0; i < faces.size(); i++) {
                Face thisFace = faces.valueAt(i);
                float x1 = thisFace.getPosition().x;
                float y1 = thisFace.getPosition().y;
                float x2 = x1 + thisFace.getWidth();
                float y2 = y1 + thisFace.getHeight();
                tempCanvas.drawRoundRect(new RectF(x1, y1, x2, y2), 2, 2, myRectPaint);
            }
            image.setImageDrawable(new BitmapDrawable(getResources(), tempBitmap));
        }
Now we have the detectFacesInImage() method, which is invoked when the button is clicked:
 private View.OnClickListener onDetectFaceListener() {
        return new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                if (bitmap == null) {
                    Toast.makeText(MainActivity.this, "Please choose image first!", Toast.LENGTH_LONG).show();
                } else {
                    detectFacesInImage();
                }
            }
        };
    }

    public void detectFacesInImage() {

        //Create a Paint object for drawing with
        Paint myRectPaint = new Paint();
        myRectPaint.setStrokeWidth(5);
        myRectPaint.setColor(Color.RED);
        myRectPaint.setStyle(Paint.Style.STROKE);

        //Create a Canvas object for drawing on
        Bitmap tempBitmap = Bitmap.createBitmap(bitmap.getWidth(), bitmap.getHeight(), Bitmap.Config.RGB_565);
        Canvas tempCanvas = new Canvas(tempBitmap);
        tempCanvas.drawBitmap(bitmap, 0, 0, null);

        //Detect the Faces
        FaceDetector faceDetector = new FaceDetector.Builder(getApplicationContext())
                .setTrackingEnabled(false)
                .build();
        if (!faceDetector.isOperational()) {
            Toast.makeText(this, "Face detector is not yet operational!", Toast.LENGTH_SHORT).show();
            return;
        }

        Frame frame = new Frame.Builder().setBitmap(bitmap).build();
        SparseArray<Face> faces = faceDetector.detect(frame);
        faceDetector.release(); // free the detector's native resources

        if (faces.size() == 0) {
            Toast.makeText(this, "No face detected!", Toast.LENGTH_SHORT).show();
        } else {
            //Draw Rectangles on the Faces
            for (int i = 0; i < faces.size(); i++) {
                Face thisFace = faces.valueAt(i);
                float x1 = thisFace.getPosition().x;
                float y1 = thisFace.getPosition().y;
                float x2 = x1 + thisFace.getWidth();
                float y2 = y1 + thisFace.getHeight();
                tempCanvas.drawRoundRect(new RectF(x1, y1, x2, y2), 2, 2, myRectPaint);
            }
            image.setImageDrawable(new BitmapDrawable(getResources(), tempBitmap));
            Toast.makeText(MainActivity.this, "Done", Toast.LENGTH_LONG).show();
        }
    }

Final code

   The full code for the main activity, the most important file:
package devexchanges.info.facedetection;

import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.RectF;
import android.graphics.drawable.BitmapDrawable;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.SparseArray;
import android.view.View;
import android.widget.ImageView;
import android.widget.Toast;

import com.google.android.gms.vision.Frame;
import com.google.android.gms.vision.face.Face;
import com.google.android.gms.vision.face.FaceDetector;

import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;

public class MainActivity extends AppCompatActivity {

    private static final int REQUET_LOADIMAGE = 111;
    private View btnChooseImage, btnDetect;
    private ImageView image;
    private Bitmap bitmap;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        btnChooseImage = findViewById(R.id.btn_choose);
        btnDetect = findViewById(R.id.btn_detect_face);
        image = (ImageView) findViewById(R.id.image);

        btnChooseImage.setOnClickListener(onLoadImageListener());

        btnDetect.setOnClickListener(onDetectFaceListener());
    }

    private View.OnClickListener onDetectFaceListener() {
        return new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                if (bitmap == null) {
                    Toast.makeText(MainActivity.this, "Please choose image first!", Toast.LENGTH_LONG).show();
                } else {
                    detectFacesInImage();
                }
            }
        };
    }

    private View.OnClickListener onLoadImageListener() {
        return new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                Intent intent = new Intent();
                intent.setType("image/*"); // filter only image type files
                intent.setAction(Intent.ACTION_GET_CONTENT);
                intent.addCategory(Intent.CATEGORY_OPENABLE);
                startActivityForResult(intent, REQUET_LOADIMAGE);
            }
        };
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUET_LOADIMAGE && resultCode == RESULT_OK) {

            if (bitmap != null) {
                bitmap.recycle();
            }
            try {
                InputStream inputStream = getContentResolver().openInputStream(data.getData());
                bitmap = BitmapFactory.decodeStream(inputStream);
                inputStream.close();
                image.setImageBitmap(bitmap);

            } catch (FileNotFoundException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    public void detectFacesInImage() {

        //Create a Paint object for drawing with
        Paint myRectPaint = new Paint();
        myRectPaint.setStrokeWidth(5);
        myRectPaint.setColor(Color.RED);
        myRectPaint.setStyle(Paint.Style.STROKE);

        //Create a Canvas object for drawing on
        Bitmap tempBitmap = Bitmap.createBitmap(bitmap.getWidth(), bitmap.getHeight(), Bitmap.Config.RGB_565);
        Canvas tempCanvas = new Canvas(tempBitmap);
        tempCanvas.drawBitmap(bitmap, 0, 0, null);

        //Detect the Faces
        FaceDetector faceDetector = new FaceDetector.Builder(getApplicationContext())
                .setTrackingEnabled(false)
                .build();
        if (!faceDetector.isOperational()) {
            Toast.makeText(this, "Face detector is not yet operational!", Toast.LENGTH_SHORT).show();
            return;
        }

        Frame frame = new Frame.Builder().setBitmap(bitmap).build();
        SparseArray<Face> faces = faceDetector.detect(frame);
        faceDetector.release(); // free the detector's native resources

        if (faces.size() == 0) {
            Toast.makeText(this, "No face detected!", Toast.LENGTH_SHORT).show();
        } else {
            //Draw Rectangles on the Faces
            for (int i = 0; i < faces.size(); i++) {
                Face thisFace = faces.valueAt(i);
                float x1 = thisFace.getPosition().x;
                float y1 = thisFace.getPosition().y;
                float x2 = x1 + thisFace.getWidth();
                float y2 = y1 + thisFace.getHeight();
                tempCanvas.drawRoundRect(new RectF(x1, y1, x2, y2), 2, 2, myRectPaint);
            }
            image.setImageDrawable(new BitmapDrawable(getResources(), tempBitmap));
            Toast.makeText(MainActivity.this, "Done", Toast.LENGTH_LONG).show();
        }
    }
}
    String resources:

Running program

    I installed this app on an Asus Zenfone 4, and here is the result:

References

    You can visit the official sites to read more about the Mobile Vision API:

