
Horizon Detection

Horizon detection, or sky segmentation, is the problem of finding the boundary between sky and non-sky regions in a given image. It has many applications, especially in the navigation of unmanned aerial vehicles (UAVs). The problem has been tackled many times, and there are two main approaches:

  • Edge detection
  • Modeling sky and non sky regions using machine learning

Most early attempts at this problem share an underlying assumption that the horizon boundary is linear. This post discusses one such method, by Ettinger, which takes the latter of the two approaches listed above.

The basic idea is that the sky and ground are modeled as two different Gaussian distributions in RGB space, and the horizon is the line segment separating the two, which can be found by maximizing an optimization criterion. The sky and ground regions are thus represented as two sets of points, each distributed about its own mean in RGB space. We then search through a set of candidate lines (m, b) to find the line with the highest likelihood of being the best-fit horizon. What remains is the scalar term for the optimization criterion. Intuitively, given a pixel grouping, we want to quantify the assumption that a sky pixel looks similar to other sky pixels, and likewise for ground pixels, so we are looking for the degree of variance within each distribution. We also want a single measure of variance for this three-dimensional data. The three eigenvalues of the covariance matrix represent the degree of variance from the mean along the three principal axes, so the product of these eigenvalues is a good scalar value. Since the product of the eigenvalues is the determinant of the matrix, we need to minimize the following function:

J1 = |Σ_s| + |Σ_g|

or rather maximize the following function:

J = 1 / (|Σ_s| + |Σ_g|)

where Σ_s and Σ_g are the covariance matrices of the pixels hypothesized as sky and ground for a candidate line (m, b), and |·| denotes the determinant.

The above function needs some adjustment when a covariance matrix is singular, but it performs quite well, as can be seen in the images below.
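To make the search concrete, here is a minimal sketch of the brute-force version of this criterion in Python with NumPy and OpenCV. The grid ranges, the downscaling step, the small epsilon guarding against singular covariance matrices, and the file name example.jpg are my own assumptions for illustration, not details from the original post; Ettinger's paper uses a more efficient coarse-to-fine search rather than a plain grid sweep.

```python
# Minimal sketch of the covariance-determinant horizon search described above.
# Assumptions: input is a BGR image, candidate lines are sampled on a coarse
# (slope, intercept) grid, and the image is downscaled to keep the search cheap.
import numpy as np
import cv2


def horizon_criterion(img, m, b):
    """Return J = 1 / (|cov_sky| + |cov_ground|) for the candidate line y = m*x + b."""
    h, w, _ = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    sky_mask = ys < (m * xs + b)          # pixels above the candidate line
    sky = img[sky_mask].astype(np.float64)
    ground = img[~sky_mask].astype(np.float64)
    if len(sky) < 3 or len(ground) < 3:   # degenerate split, reject
        return -np.inf
    det_s = np.linalg.det(np.cov(sky, rowvar=False))
    det_g = np.linalg.det(np.cov(ground, rowvar=False))
    return 1.0 / (det_s + det_g + 1e-12)  # epsilon guards the singular case


def find_horizon(img, n_slopes=20, n_intercepts=20):
    """Brute-force search over a (slope, intercept) grid; returns the best (m, b)."""
    h, _, _ = img.shape
    best, best_line = -np.inf, None
    for m in np.linspace(-1.0, 1.0, n_slopes):
        for b in np.linspace(0.1 * h, 0.9 * h, n_intercepts):
            j = horizon_criterion(img, m, b)
            if j > best:
                best, best_line = j, (m, b)
    return best_line


if __name__ == "__main__":
    image = cv2.imread("example.jpg")        # hypothetical input image
    small = cv2.resize(image, (80, 60))      # downscale before the exhaustive search
    m, b = find_horizon(small)
    print("Estimated horizon (downscaled coords): y = %.2f * x + %.2f" % (m, b))
```

Downscaling before the search matters in practice: the criterion must be evaluated once per candidate line, and each evaluation touches every pixel, so the cost grows quickly with resolution.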

However, in images where the separation between sky and ground in RGB space is less distinct, or where there is significant color variation within the sky or ground (for example, a very bright sun in one quarter of the frame), the detected horizon line can be distorted. Some images where the method does not work as well are shown below.

References:

  • Vision-Guided Flight Stability and Control for Micro Air Vehicles, Scott M. Ettinger
