Saturday, September 25, 2010

Air pollution monitoring with a mobile phone


http://www.wired.com/images_blogs/gadgetlab/2010/09/mobile-sensing-660x396.jpg

An Android app called Visibility, developed by researchers at the University of Southern California, lets users take a photo of the sky and get an estimate of the air quality.
The free app is currently available for phones running version 2.1 of the Android operating system.

"Airborne particulate matter is a serious threat to both our health and the environment," say the researchers on their blog. "We are working towards an optical technique to measure air visibility, and hence an estimate of some kinds of air pollution, using cameras and other sensors available on smartphones."

It's a neat idea and it's interesting to see how smartphones are giving rise to the trend of citizen science and crowdsourced data.

As smartphones become ubiquitous and increasingly powerful, researchers are turning to them for complex computations and crowdsourced data gathering. For instance, as part of a project called 'Common Sense', Intel's research labs developed sensors that could be attached to GPS-enabled phones to measure air quality. The data gathered from these sensors would be brought back and processed to help researchers understand pollution levels.

The Visibility Android app aims to offer something similar while making the process more user-friendly.

With the Visibility app, each user's photo of the sky is tagged with location, orientation and time. The data is transferred to a server where the calculations take place. The level of air quality is estimated by radiometrically calibrating the submitted images and comparing their intensity against an existing model of sky luminance, say the researchers.
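The paper reproduced below spells out the exact fields, but in rough terms each upload bundles a cropped sky image with a small tag file of sensor readings. A minimal sketch of what such a payload might look like (the field names and values here are hypothetical, not the app's actual format):

    # Hypothetical tag accompanying one sky photo; field names are illustrative only.
    tag = {
        "timestamp_utc": "2010-09-25T21:14:03Z",
        "camera": {"azimuth_deg": 132.0, "zenith_deg": 45.0, "roll_deg": 1.2},
        "solar": {"azimuth_deg": 210.5, "elevation_deg": 38.7},  # computed on the phone
        "location": None,  # omitted when the user enables the privacy filter
    }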

The result is sent back to the user and the data is also used to create pollution maps for the region. An iPhone version of the app is in the works.


Visibility Monitoring using Mobile Phones

Sameera Poduri, Anoop Nimkar and Gaurav S. Sukhatme

Department of Computer Science, University of Southern California
{sameera, nimkar, gaurav}@usc.edu



ABSTRACT

Airborne particulate matter is a serious threat to both our health and the environment. It is also the primary cause of visibility degradation in urban metropolitan areas. We present the design, implementation, and evaluation of an optical technique to measure visibility using commodity cameras and other sensors commonly found on mobile phones. The user takes a picture of the sky, which is tagged with location, orientation, and time data and transferred to a backend server. Visibility is estimated by first calibrating the image radiometrically and then comparing the intensity with a physics-based model of sky luminance. We describe the challenges of developing the system on the HTC G1 phone running the Android OS. We study the sensitivity of the technique to error in the accelerometers and magnetometers. Results from images gathered in Phoenix, Arizona and the Los Angeles basin compare favorably to air quality data published by the US Environmental Protection Agency.

Figure 1: Los Angeles is ranked as one of the most polluted cities in the country in terms of year-round particle pollution.

1. INTRODUCTION

Atmospheric visibility refers to the clarity with which distant objects are perceived. It is important as a measure of air quality, for driving safety, and for tourism. Without the effects of manmade air pollution, the natural visual range would be nearly 140 miles in the western USA and 90 miles in the eastern areas [1]. Today the visibility has decreased to 35-90 miles in the west and 15-25 miles in the east. The atmospheric pollutants that most often affect visibility exist as haze aerosols, which are tiny particles (10 µm and smaller) dispersed in air that scatter sunlight, imparting a distinctive gray hue to the sky. The suspended particles may originate as emissions from natural sources (e.g., sea salt entrainment and wind-blown dust) or from manmade sources (e.g., automobile exhaust and mining activities). This particulate matter, or PM, is cited as a key reason for heart and lung problems, especially in metropolitan areas such as Los Angeles [4]. Recent studies also show that particulate matter enhances global warming [13]. Atmospheric visibility is a measure of particulate matter concentration [11]. The United States Environmental Protection Agency (EPA) initiated the Interagency Monitoring of Protected Environments (IMPROVE) in 1985 to monitor air quality and visibility in 157 national parks and wilderness regions across the country. In 1999, the Regional Haze Regulation was promulgated, which mandates improvement of atmospheric visibility.

While monitoring air visibility is important for our health as well as the environment, current monitoring stations are very sparsely deployed (figure 2). Visibility is typically measured using human observers, optical instruments such as photometers and transmissometers, or chemical sensors such as integrating nephelometers. While the human observer method suffers from subjectivity, optical and chemical measurement is very precise but expensive and requires maintenance. In several developing countries around the world, there is little or no monitoring infrastructure available.

Our goal is to develop an air visibility sensing system that uses off-the-shelf sensors and can be easily deployed to be used by a large number of people. This will enable large-scale sensing of visibility and augment existing instrumentation that is precise but expensive and sparse. We propose to use phones for three reasons: 1) phones have proliferated all over the world and can potentially allow massive sensing coverage, 2) most high-end phones are equipped with cameras and other sophisticated sensors that can be used to measure visibility, and 3) having a human in the loop can help intelligent data collection and also gather data where it matters.




Figure 2: (a) Average particulate matter concentration in California. The orange regions are above the air quality standard stipulated by the EPA. (b) Monitoring stations are sparsely deployed. The counties in white have no monitoring stations.



Our application works as follows. The user starts an application on his phone, points the phone at the sky and takes a picture. The application tags the image with accelerometer, magnetometer, date and time information and stores it on the phone. It uses the GPS and time information to compute the current solar position, appends this to the tag file, and sends it along with the image to a backend server. The solar and camera orientation data is used to compute an analytic model of the sky as a function of the atmospheric visibility. By comparing this with the intensity profile of the image, we estimate visibility. If the user prefers, the application can also transfer his GPS coordinates to the backend server so that the visibility information is displayed on a map to be shared with other users.

The main contributions of this work are as follows.

• Design of a visibility estimation algorithm that takes as input an image of the sky, the orientation of the camera and the solar orientation

• Design and implementation of the system on HTC G1 phones taking into account privacy and efficiency factors

• Evaluation of the system using images from 3 different sources, and analysis of the effect of sensor noise

The paper is organized as follows. The next section defines metrics for atmospheric visibility and gives an overview of the common methods of measuring it and its relation to air quality. Section 3 presents the architecture and design of the system and its implementation on the HTC G1 smartphone. The sky luminance model, radiometric calibration technique and visibility estimation algorithm are described in section 4. This is followed by experimental results, including sensitivity analysis, in section 5. We place the work in context by describing related research in section 6 and conclude with a discussion in section 7.

2. VISIBILITY

Visibility varies because light gets scattered and absorbed by particles and gases in the atmosphere. According to the EPA, particulate matter pollution is the major cause of reduced visibility (haze) in parts of the United States. Because particles typically scatter more uniformly than molecules for all wavelengths, haze causes a whitening of the sky. The particles can come from many sources - industrial and vehicular emissions, volcanic eruptions, forest fires, cosmic bombardment, the oceans, etc. A commonly used measure of atmospheric visibility is the meteorological range Rm, which is the distance under daylight conditions at which the apparent contrast between a black target and its background (horizon sky) becomes equal to a threshold constant of an observer; it roughly corresponds to the distance to the most distant discernible geographic feature [11]. Koschmieder derived a formula that relates the meteorological range to the aerosol extinction coefficient β:

Rm = 3912 / β

where β is expressed in Mm⁻¹ and Rm in km. The formula shows that visibility closely correlates with aerosol load and it is therefore a good indicator of air quality.
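As a quick numerical illustration of the Koschmieder relation (a sketch for this write-up, not part of the authors' toolchain), converting an extinction coefficient in Mm⁻¹ to a meteorological range in km is a one-liner:

    def meteorological_range_km(beta_inv_megameter):
        """Koschmieder relation R_m = 3912 / beta, with beta in Mm^-1 and R_m in km."""
        return 3912.0 / beta_inv_megameter

    # For example, an extinction of 10 Mm^-1 corresponds to a ~390 km range,
    # while 100 Mm^-1 collapses it to about 39 km.
    print(meteorological_range_km(10.0))    # ~391.2
    print(meteorological_range_km(100.0))   # ~39.1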

In the atmospheric sciences literature, another metric called turbidity is used. Turbidity is a measure of the fraction of light scattering due to haze as opposed to molecules [11]. Formally, turbidity T is defined as the ratio of the optical thickness of a path in the haze atmosphere (haze particles and molecules) to the optical thickness of the path in an atmosphere with the molecules alone:




T = (tm + th) / tm

where tm is the vertical optical thickness of the molecular atmosphere, and th is the vertical optical thickness of the haze atmosphere. Optical thickness t for a path of length x is defined in terms of the extinction coefficient β as follows:

t = ∫₀ˣ β(x) dx

Strictly speaking, visibility is only defined for a path and not for a region. But if the region is homogeneous, we can define its visibility as that of a random path. Generally, horizontal paths are considered homogeneous and vertical paths are least homogeneous. In fact, the aerosol concentration rapidly decreases in the vertical direction. Most of the aerosol particles exist in the region 10-20 km above the surface of the earth. Turbidity and meteorological range are closely related, as shown in figure 3.

Figure 3: Relation between turbidity and meteorological range in km.

In our work, we estimate turbidity directly using models of sky appearance.
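To make the definitions above concrete, the following sketch evaluates the optical-thickness integral numerically for a molecules-only and a molecules-plus-haze atmosphere and forms the turbidity ratio. The extinction profiles are invented purely for illustration and are not taken from the paper:

    import numpy as np

    def optical_thickness(beta, x_max_km=20.0, samples=2000):
        """t = integral of beta(x) dx over a path of length x_max_km."""
        x = np.linspace(0.0, x_max_km, samples)
        return np.trapz(beta(x), x)

    # Hypothetical vertical extinction profiles in km^-1 (illustrative numbers only).
    beta_molecular = lambda x: 0.012 * np.exp(-x / 8.0)   # molecular component
    beta_haze      = lambda x: 0.10  * np.exp(-x / 1.5)   # aerosols concentrated near the ground

    t_m = optical_thickness(beta_molecular)
    t_h = optical_thickness(beta_haze)
    print((t_m + t_h) / t_m)   # turbidity T as defined above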



3. SYSTEM DESIGN

In this section, we present the design and implementation of the visibility sensing system. Figure 4 shows the high-level architecture. The user gathers a picture that gets automatically tagged with relevant sensor data and transmitted to a backend server, which processes the data to estimate visibility and returns a value of turbidity to the user. Based on the privacy preference of the user, the image, its location and the turbidity value are displayed on a publicly accessible map and stored in a database for future analysis.

Our design is based on the following considerations.

• Privacy: The visibility estimation algorithm is based on potentially sensitive information such as location and images. To ensure user privacy, we process the data such that the user is anonymous with respect to the information that leaves the phone. The user's GPS coordinates are used to compute the solar position on the phone, which is communicated to the backend server without the GPS data. Similarly, the image is cropped and only the segment that contains the sky pixels is shared. However, the user has the option of sharing complete information, which can be useful for large-scale monitoring and analysis.

• Communication cost: In the default usage mode, the data transfer is initiated immediately after data logging, assuming that the phone is connected to the internet. However, we provide an option to delay the transfer until a preferred internet connection is available.

• No blocking: After data is transferred, the response from the server may take a few minutes. During this time, the visibility application does not block the phone. Instead, it switches into a background mode where it waits for a response and frees the phone for other applications. On receiving a response, it displays it as a notification.

• Aiding data collection: Sensors on the phone can guide data collection so that the data is well-suited for the estimation algorithm. We use the phone's orientation sensors to help the user hold the phone parallel to the ground (without roll), as that yields the best results. Similarly, we deactivate the camera button if the zenith angle is more than 100°, as we are only interested in images of the sky.

• Human in the loop: Several computer vision problems that are extremely challenging to automate are trivially solved by a human. In our system, segmenting sky pixels in an arbitrary image is one such problem. When the user captures an image, we ask him to select a part of the image that is sky. By exploiting the participatory nature of our paradigm, we can build a system that is robust and efficient.

• Energy efficiency: The sensors that consume the most energy in our system are the GPS and the camera. Of these, the GPS is not essential because very coarse, city-scale localization is sufficient for visibility estimation. The user can turn off GPS in the application, making it use either cell tower ID based locations or the last recorded GPS location.

We will now describe the details of our implementation on the HTC G1 smartphone.

3.1 Hardware



Figure 4: Overview of the system.

Current high-end mobile phones (such as the iPhone, HTC G1, Nokia N97, BlackBerry, etc.) are embedded with cameras, accelerometers, GPS sensors and, in some cases, even magnetometers. We chose the HTC G1 phone because it can sense 3D orientation and is easily programmable.

The HTC G1 runs the Android OS, which provides an open SDK and APIs to program the sensors. Android is a software stack for mobile devices that includes the operating system, middleware and application level. Android runs a Linux kernel, and applications are developed in Java using the standard Java APIs as well as libraries for Android-specific APIs. The phone has significant computational capability, with a 528 MHz processor, 64 MB internal RAM and 128 MB internal ROM for the OS and applications. With this processing power, our on-phone computation of solar position is almost instantaneous. The phone has a 1 GB MicroSD card where the images taken from the camera are stored along with their tags.

The phone is embedded with a 3.1 megapixel camera with a dedicated camera button. It supports JPG, BMP, PNG, and GIF formats. We use the RGB format to store color information. The camera does not allow optical zooming, which means that all images from a phone have a fixed focal length, thus allowing a one-time calibration. The Android orientation API combines information from a 3-axis magnetic sensor and a 3-axis accelerometer to report the 3D orientation of the phone. It uses a dynamic offset estimation algorithm to compensate for the local magnetic field. We found that, in spite of this, there is a significant error in the azimuth values (figure 16). The GPS data is highly accurate (-160 dBm tracking sensitivity) for our purpose.

The backend system consists of FTP and HTTP servers running on a standard desktop that runs MATLAB and communicates with the phone through the internet.

3.2 Software Design

Figure 5 shows screenshots of the phone application built on the Android 1.5 SDK. It begins with a splash screen with options to edit settings or capture an image. On the settings screen the user can choose his internet tagging, privacy and file transfer settings. Turning on the internet tagging option causes the application to tag the image with additional data such as weather information obtained from the internet. Such information will be useful for further study of visibility conditions. As mentioned earlier, the file transfer option controls whether sensor data will be automatically transferred to the server or stored on the phone and transferred later when the user clicks the 'Transfer files' button. This facility is useful if either the phone does not have 3G capability or the user prefers to wait for a cheaper WiFi connection. Turning on the privacy filter will prevent GPS data from being transferred to the server. If it is off, the GPS data is used to archive the sensor data and display it on a publicly accessible map.

When the user clicks the start button, the image capture screen with the camera preview appears. On this screen, the azimuth and zenith angles are shown in green if the roll is less than 5°, i.e., the phone is parallel to the ground. If not, the angles appear in red. If the angles are in green, the user can capture an image by pressing the camera button. The image is saved and displayed on the screen and the user is prompted to choose two points on the image such that the rectangle formed with those points as a diagonal contains only sky pixels. The image is cropped to this box and sent to the server along with a separate tag file containing orientation, date, time, and solar position, which is computed on the phone (as described in the next subsection). Files are transferred over the standard FTP protocol, in which the client program runs on the phone and the FTP server process runs on the backend server. While the orientation data is stored at the time of image capture, the GPS sensor is invoked as soon as the application starts, since it can take a few seconds to locate satellites. After segmenting the sky portion of the image, the user can also provide additional information about cloud cover and the apparent visibility. This information will be useful in studying the performance of the system. After this step, the application switches to a background mode where it listens for a message from the backend server. The resulting turbidity estimate is displayed as a notification along with the time of image capture.

Figure 6 shows the flow of information and the computations performed on the phone and the backend server. On the phone end, we use location and time information to compute the solar azimuth and elevation angles using the algorithm described in [16]. The 3D orientation computation is performed by Android's orientation API.
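The paper uses the solar position algorithm of Reda and Andreas [16]. The sketch below is a much coarser textbook approximation (declination from day of year, hour angle from an uncorrected solar time), shown only to illustrate the kind of computation the phone performs; it can be off by a degree or two and is not the authors' implementation:

    import math
    from datetime import datetime, timezone

    def solar_position(lat_deg, lon_deg, when_utc):
        """Approximate solar elevation and azimuth in degrees (azimuth from north)."""
        day = when_utc.timetuple().tm_yday
        decl = math.radians(-23.44) * math.cos(math.radians(360.0 / 365.0 * (day + 10)))
        hours = when_utc.hour + when_utc.minute / 60.0 + when_utc.second / 3600.0
        solar_time = hours + lon_deg / 15.0                 # ignores the equation of time
        hour_angle = math.radians(15.0 * (solar_time - 12.0))
        lat = math.radians(lat_deg)

        sin_elev = (math.sin(lat) * math.sin(decl)
                    + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
        elev = math.asin(sin_elev)
        cos_az = ((math.sin(decl) - math.sin(elev) * math.sin(lat))
                  / (math.cos(elev) * math.cos(lat)))
        az = math.acos(max(-1.0, min(1.0, cos_az)))
        if hour_angle > 0:                                  # afternoon: sun moves west of north
            az = 2.0 * math.pi - az
        return math.degrees(elev), math.degrees(az)

    # Example: Los Angeles around local noon (20:00 UTC).
    print(solar_position(34.0, -118.0, datetime(2009, 8, 26, 20, 0, tzinfo=timezone.utc)))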




Figure 5: The Visibility application for the Android OS. (a) Startup screen with options to view/edit settings and start taking the picture. (b) Settings for communication and privacy. If internet tagging is turned on, the application will gather weather data. The file transfer option allows the user to choose between transferring the image immediately or at a later, more convenient time. The privacy filter controls whether the user's GPS data is communicated. (c) Camera preview. The azimuth and zenith angles are displayed in green when the roll is < 5° and in red (d) otherwise. (e) The user chooses a portion of sky for processing. Clicking the camera button at this point stores the image along with orientation data. (f) The computed turbidity value is returned as a notification.

Figure 6: System architecture.




The backend server runs Perl scripts that are triggered by the phone through HTTP requests. These scripts initiate image processing, which includes radiometric correction and computation of image luminance. This is implemented using MATLAB. The camera orientation and solar orientation data are used to compute the analytical model of sky luminance. This is followed by an optimization step to estimate the visibility. The analytical model also uses the focal length information, which can usually be obtained from the phone. The resulting visibility value is sent back to the phone as a response to the HTTP request. If the privacy filter is off, then the image along with the visibility value is sent to a web server which displays them on a map (figure 7).

Figure 7: Images and the visibility estimates are displayed on a publicly accessible map.

4. VISIBILITY ESTIMATION

In this section, we describe the details of the analytical sky model, the radiometric calibration and the visibility estimation algorithm. The method is based on [9], where the sky luminance model is used to calibrate the geometric parameters of the camera assuming clear sky images. In our case, the focal length and orientation of the camera are known and we seek to estimate the turbidity of images.

4.1 Sky Luminance

Several physical as well as empirical models for the sky luminance have been proposed [8]. Of these, the model proposed by Perez et al. [12] is shown to work well for different weather conditions. It is a generalization of the CIE standard clear sky formula. The luminance of a sky element is given by

L(θp, γp) = L0 · f(θp, γp, t) / f(0, θs, t)    (1)

where θp is the zenith of the sky element, γp is the angle between the sky element and the sun, and θs is the zenith of the sun. We call f the scaled luminance, as it captures the ratio of true luminance to zenith luminance. It is defined as follows:

f(θp, γp, t) = (1 + a·e^(b / cos θp)) · (1 + c·e^(d·γp) + e·cos²γp)    (2)

where a, b, c, d, and e are adjustable coefficients. Each of the parameters has a specific physical effect on the sky distribution, and with different values they can capture a wide variety of sky conditions. An empirical model of the parameters in terms of turbidity t has been proposed [14]:

a = 0.1787·t - 1.4630
b = -0.3554·t + 0.4275
c = -0.0227·t + 5.3251
d = 0.1206·t - 2.5771
e = -0.0670·t + 0.3703

Figure 8 shows the variation in the scaled luminance ratio (f) as the value of turbidity changes. There is a significant change in the shape of the surface, and this can be used to estimate turbidity.
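A direct transcription of equations 1 and 2 with the turbidity-parameterized coefficients makes the model easy to experiment with. The following is a sketch based on the reconstruction above (angles in radians); it assumes the standard Perez/Preetham form of the exponential term and is not the authors' MATLAB code:

    import math

    def perez_coefficients(t):
        """Perez parameters a..e as linear functions of turbidity t [14]."""
        return (0.1787 * t - 1.4630,
                -0.3554 * t + 0.4275,
                -0.0227 * t + 5.3251,
                0.1206 * t - 2.5771,
                -0.0670 * t + 0.3703)

    def scaled_luminance(theta_p, gamma_p, t):
        """f(theta_p, gamma_p, t) of equation 2; angles in radians."""
        a, b, c, d, e = perez_coefficients(t)
        return ((1.0 + a * math.exp(b / math.cos(theta_p)))
                * (1.0 + c * math.exp(d * gamma_p) + e * math.cos(gamma_p) ** 2))

    def relative_luminance(theta_p, gamma_p, theta_s, t):
        """L / L0 of equation 1: luminance of a sky element relative to the zenith."""
        return scaled_luminance(theta_p, gamma_p, t) / scaled_luminance(0.0, theta_s, t)

    # Sweep turbidity for one fixed geometry to see how strongly the ratio depends on t.
    for t in (2, 10, 20, 30):
        print(t, round(relative_luminance(math.radians(45), math.radians(30),
                                          math.radians(30), t), 3))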

Equation 2 can be expressed in terms of the pixel coordinates as shown in [9]. We reproduce the equations here for clarity.

θp = arccos( (vp·sin θc + fc·cos θc) / √(fc² + up² + vp²) )

φp = arctan( (fc·sin φc·sin θc - up·cos φc - vp·sin φc·cos θc) / (fc·cos φc·sin θc + up·sin φc - vp·cos φc·cos θc) )

γp = arccos( cos θs·cos θp + sin θs·sin θp·cos(φp - φs) )

Here up and vp are the pixel coordinates of a point p and fc is the focal length of the camera. θp and φp are the corresponding zenith and azimuth angles, and γp is the relative orientation to the solar position. By substituting for these and for the parameters a, b, c, d, e in equation 2, we rewrite f in terms of the variables of interest to our problem as g. We have

L(θp, γp) = L0 · g(θc, φc, up, vp, t) / g(0, θs, 0, 0, t)    (3)
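The per-pixel geometry translates almost directly into code. A sketch (assuming up and vp are pixel offsets from the image center expressed in the same units as the focal length fc, all angles in radians; atan2 is used instead of arctan to keep the correct quadrant):

    import math

    def pixel_angles(u_p, v_p, f_c, theta_c, phi_c, theta_s, phi_s):
        """Zenith, azimuth and sun-relative angle of the sky element seen at pixel (u_p, v_p)."""
        norm = math.sqrt(f_c ** 2 + u_p ** 2 + v_p ** 2)
        theta_p = math.acos((v_p * math.sin(theta_c) + f_c * math.cos(theta_c)) / norm)

        num = (f_c * math.sin(phi_c) * math.sin(theta_c)
               - u_p * math.cos(phi_c) - v_p * math.sin(phi_c) * math.cos(theta_c))
        den = (f_c * math.cos(phi_c) * math.sin(theta_c)
               + u_p * math.sin(phi_c) - v_p * math.cos(phi_c) * math.cos(theta_c))
        phi_p = math.atan2(num, den)

        gamma_p = math.acos(math.cos(theta_s) * math.cos(theta_p)
                            + math.sin(theta_s) * math.sin(theta_p)
                            * math.cos(phi_p - phi_s))
        return theta_p, phi_p, gamma_p

    # Center pixel of a camera tilted 45 degrees from the zenith; sun at 30 degrees zenith.
    print(pixel_angles(0.0, 0.0, 2031.0, math.radians(45), 0.0,
                       math.radians(30), math.radians(60)))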



4.2 Radiometric Calibration

Digital cameras are not designed to capture the entire radiance range in the real world. The camera applies a non-linear mapping, called the response function, to the scene radiance to generate a smaller, fixed range of image intensity values. In order to measure scene radiance from image intensities, it is necessary to learn the inverse response function. This is called radiometric calibration. A common approach is to take a series of images with varying camera exposure settings and estimate the inverse response function. This is not feasible for our application because we cannot control the exposure settings on most phones.




Figure 8: Illustration of the Perez model for sky luminance, for φc = 15°, 60°, 180° (columns) and t = 2, 10, 20, 30 (rows). The luminance ratio (f) changes significantly with turbidity. In the above graphs fc = 2031 (based on the HTC G1 phone camera), θc = 45°, θs = 30° and φs = 0. Note that the scale of the z axis is different for different graphs. In general, the luminance ratio increases as the turbidity increases.




Figure 9: (a) Image, (b) corners used for estimation, (c) resulting response functions for the red, blue and green channels.

We use the technique proposed by Lin et al. [10], which uses a single image, as suggested in [9]. The method works by first extracting color edges in an image and finding an inverse response function that results in an almost linear color blending in RGB space. This is done using maximum a posteriori estimation, where the probability of an inverse response curve is proportional to the distance in RGB space between edge colors, and the prior model is obtained from [6]. Both the image intensity and irradiance values are normalized to [0, 1]. Figure 9 shows the resulting curves for the red, green and blue channels for the HTC G1 phone, obtained by calibrating over 10 different images and 3 different phones.

4.3 Visibility Estimation

Visibility is estimated by matching the scaled luminance ratio (f) with the observed image intensity values at sky pixels after the radiometric calibration. Intensity I is computed from RGB values using the CIE standard formula I = 0.2126R + 0.7152G + 0.0722B. We find the value of t that minimizes the sum of squared errors between the measured intensity and the analytic luminance value over the set P of sky pixels:

(t, k) = argmin_{t,k} Σ_{p∈P} ( Ip - k·g(θc, φc, up, vp, t) )²

The scaled luminance ratio g is computed using the solar position, camera orientation, and focal length data reported by the phone. Note that we cannot recover true irradiance values from the image, but only scaled values. The Perez model for g also captures the luminance ratio with respect to the zenith. Therefore there is a constant factor k between the image intensity I and f at each pixel, which can be estimated.

The above optimization can be solved using standard techniques such as Levenberg-Marquardt. We use initial values of 2 for t and 0.5 for k, and bounds of [0, 40] for t and [0, 1] for k.

In our experiments, we observed that the compass on the phone gives a significant error. Therefore, we modify the optimization to also estimate a compass offset Δφc:

(t, k, Δφc) = argmin_{t,k,Δφc} Σ_{p∈P} ( Ip - k·g(θc, φc + Δφc, up, vp, t) )²    (4)
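Pulling the pieces together, the fit is a small bounded nonlinear least-squares problem over the segmented sky pixels. The paper solves it with Levenberg-Marquardt in MATLAB; as an illustrative stand-in, the sketch below uses SciPy's bounded least-squares solver (plain Levenberg-Marquardt in SciPy does not accept bounds). Here g is a placeholder for the pixel-wise scaled-luminance function of equations 2 and 3 with the focal length and solar position already fixed, and the ±45° bound on the compass offset is an assumption of this sketch, not a value from the paper:

    import numpy as np
    from scipy.optimize import least_squares

    def intensity_from_rgb(rgb):
        """CIE luminance of linearized RGB pixels: I = 0.2126 R + 0.7152 G + 0.0722 B."""
        return rgb @ np.array([0.2126, 0.7152, 0.0722])

    def fit_turbidity(I, u, v, theta_c, phi_c, g):
        """Estimate turbidity t, scale k and compass offset dphi as in equation 4.

        I, u, v are arrays over the sky pixels; g(theta_c, phi_c, u, v, t) must
        return the scaled luminance for every pixel.
        """
        def residuals(params):
            t, k, dphi = params
            return I - k * g(theta_c, phi_c + dphi, u, v, t)

        result = least_squares(
            residuals,
            x0=[2.0, 0.5, 0.0],                              # initial t, k, dphi
            bounds=([0.0, 0.0, -np.pi / 4], [40.0, 1.0, np.pi / 4]))
        return result.x   # (t, k, dphi)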

5. EXPERIMENTS

This section describes a series of experiments conducted to validate our system and study its sensitivity to errors in the sensor data. We conducted two sets of experiments. The first used static camera setups that monitor weather and visibility conditions. These images do not have significant orientation error and allow us to study the system on a series of images of the same scene. The second set of images was taken with the HTC G1 mobile phone using the visibility application described above. In all these experiments, the complete image was logged and the sky part was segmented interactively at the backend during processing.

5.1 Static camera image sources

South Mountain Web Camera

The Arizona Department of Environmental Quality (ADEQ) maintains several cameras near Phoenix. We used images from a 2.4 megapixel camera located in the North Mountains looking south. The pictures from this camera are published every 15 minutes. We used the method in [9] to calibrate the focal length and the azimuth and zenith angles of this camera.




Altadena Weather Station

This 5 megapixel camera, located in Altadena, California, looks north-northeast towards the San Gabriel valley. It takes a picture every 5 minutes that is published online. Again, we use [9] to calibrate the camera.

USC Rooftop Camera Station

Recently, we set up an image station on the roof of a building at the University of Southern California (figure 11). It has an Android phone placed inside a weather-proof box. It looks northeast at downtown Los Angeles and logs images every 15 minutes. The focal length of the camera was estimated using the MATLAB calibration toolbox [3].

Results

Figure 10 shows three examples of images from the South Mountain camera that are reported to have good, fair and poor visibility by the ADEQ. The corresponding image intensity surfaces and the scaled luminance profiles for the values of turbidity (2, 3 and 6) estimated by our algorithm are shown. The turbidity values increase as the visibility decreases, and the luminance surfaces match well with the image intensity profiles.

For the camera in Altadena, there is no ground truth visibility data available. Therefore, instead of comparing visibility values, we look at the average trend in visibility during a day (figure 12), averaged over 50 days during March 2009 to July 2009 (we eliminated days that had cloudy skies). The turbidity values are highest around 9 am and then steadily decrease to reach a minimum around 4 pm. This is in fact a well known trend in particulate matter concentration. As an example, we plot the PM2.5 concentration in central Los Angeles over the same 50 days.

5.2 HTC G1 Phone

The HTC G1 phone was used to gather data using our application in the Los Angeles basin. The data was gathered over a period of 3 months. We focus on 3 significant visibility events during this time when the Air Quality Index (AQI) published by the South Coast Air Quality Management District (AQMD) had extreme values: 1) fire day: the Los Angeles wildfires around August 26th 2009, when the AQI was 150 and labeled 'unhealthy'; 2) hazy day: an extreme low air quality day on November 8th 2009, when the AQI was 120 and labeled 'unhealthy for sensitive groups'; and 3) clear day: an extremely clear air quality day following rain and snow on December 8th 2009, when the AQI was 23 and labeled 'good'. Figure 13 shows two representative images for each of the clear day and the fire day and the luminance profiles for turbidity values of 2, 10, and 11. We also show the variation in error for different values of t. For the fire day, the intensity surface has a different shape compared to the analytic model. We believe this is because of inhomogeneity in the atmosphere caused by uneven clouds of smoke. The images shown here are facing the Angeles National Forest where the fires took place. Because the surfaces do not match properly, increasing t from 5 to around 17 also has very little impact on the error.

5.3 Sensitivity analysis of camera orientation error

We conducted a series of experiments to study the error in the HTC G1 phone's orientation sensing and its impact on estimated luminance values. To compute the error, we captured images of the sun using our visibility application and calculated the solar position in the image using the time of day, GPS, and camera orientation reported by the phone. We then compared these pixel coordinates with the visually detected solar position in each image (figure 16(a)). The resulting error over 100 images taken using 8 different phones is shown in figure 16(b) and (c) for the azimuth and zenith angles. While the zenith error is mostly within ±5°, the azimuth error is significantly larger.

We analyzed the impact of error in the azimuth angle on the intensity ratio. Figure 15 shows the derivative of the luminance ratio f(θp, γp) with respect to the camera azimuth φc for different values of solar zenith. The solar azimuth φs is fixed at 0° and the camera zenith θc is fixed at 45°. Repeating the computation for different values of φs and θc produces similar graphs. The graph shows that for φc between 100° and 160°, the luminance ratio varies only slightly with φc.

Figure 15: Sensitivity of luminance to compass error.

5.4 Localization

The GPS data is used to calculate the position of the sun. However, the position of the sun changes very slowly with location. For instance, in Los Angeles (latitude 34° N and longitude 118° W), the solar azimuth changes by less than a degree over a distance of more than 60 miles.




Figure 10: Results from the South Mountain Camera. Rows: good visibility (t = 2), fair visibility (t = 3), poor visibility (t = 6). Columns: (a) image, (b) intensity, (c) model luminance.

Figure 11: USC Rooftop Camera Station (panels a-c).




Figure 12: (a) Scatter plot of pollution data during a single day, averaged over 50 days, in central Los Angeles. (b) Average visibility data estimated using images over 50 days from a camera in Altadena.



Therefore, a city-level localization can compute the solar position accurately enough. The sensitivity of luminance to the solar azimuth is exactly the same as to the camera azimuth, because the model only uses the relative azimuth angle.

6. RELATED WORK

In this section we review current research in three areas related to our work.

Image processing based visibility mapping: There is a growing interest in monitoring environmental conditions using commodity cameras. Several approaches have been proposed to compute visibility from images. In [17], visibility is computed as the ratio of pixel contrast just below and just above the horizon. The method relies on being able to detect the horizon accurately, which is challenging especially in poor visibility conditions. It is well suited for static web cameras, where the horizon does not change and can therefore be computed on a clear day. Another approach is to use Fourier analysis of the image, since clear images are likely to have more high frequency components ??. This approach applies to cases where the objects in the image are at a similar distance away as the visibility range. Other methods such as [2] and [15] require detection of a large number of visual targets, either manually or automatically. All the above methods are based on image processing alone. In contrast, the approach we propose is based on an analytic model of sky appearance that takes into account the camera's orientation and the solar position.

Air quality mapping using phones: Owing to the ubiquity of mobile phones, researchers have proposed integrating chemical sensors with mobile phones to measure air quality [5, 7], thus allowing large-scale sensing. Our work is in the same spirit of participatory sensing, but we seek to measure air quality using sensors commonly available on phones.

Sky model based camera calibration: Models of sky appearance have been studied for several years, and the computer graphics community has used these models for realistic rendering of scenes. In the past few years, these models have been used to analyze images to study photometric properties of scenes [18]. Recently, Lalonde et al. have calibrated the focal lengths and orientations of a large number of web cameras using images available on the internet [9]. We use the same approach to estimate visibility. The key difference in our case is that, unlike web cameras, the camera in our case can be controlled and its geometric parameters are known.

7. CONCLUSION

Air quality is a serious global concern that affects our health as well as the environment. While several efforts are underway to monitor and disseminate air quality information, the sensing locations are still extremely sparse because the monitoring stations are expensive and require careful installation and maintenance. Our vision is to augment these precise but sparse air quality measurements with coarse, large-scale sensing using commodity sensors. To this end, we propose a system that uses cameras and other sensors commonly available on phones to estimate air visibility. Using accelerometer and magnetometer data along with coarse location information, we generate an analytic model of sky appearance as a function of visibility. By comparing this model with an image taken using the phone, visibility is estimated. We present the design, implementation and evaluation of the system on the HTC G1 phone running the Android OS with a backend server. To ensure user privacy, GPS data is processed on the phone and the image is cropped to contain only sky pixels before sharing it with the server. Our results show that the system can reliably distinguish between clear and hazy days.

While our initial results are promising, several challenges exist.


Figure 13: Comparison of images taken during the wildfires in Los Angeles on 30th August 2009 with those taken on a very clear day on 8th December 2009. Rows: clear day (t = 2, AQI 23); clear day (t = 2, AQI 23); LA wildfires day (t = 11, AQI 150); LA wildfires day (t = 10, AQI 150).




Figure 14: Comparison of images taken during a hazy day on 8th November 2009 (t = 9, AQI 120) with those taken on a very clear day on 8th December 2009 (t = 2, AQI 23). In this case, the pictures were taken at the same time of day, 3 pm.

Figure 16: Compass error in HTC G1 phones (panels a and b).




First, the model assumes that the atmosphere is homogeneous. This is generally true in the horizontal direction, but when looking at an angle the haze tends to be layered. The impact of this is clear in the images we took during the wildfires in Los Angeles. To address this, we plan to investigate further sky models and also use the sensors on the phone to guide the user to gather data at favorable angles. Second, we currently assume that after the user crops the image, it does not contain any clouds. While cloudy skies are a fundamental limitation of this approach, we plan to develop segmentation techniques that will allow us to use disconnected cloud-free sky segments.

Acknowledgements

We thank Bill Westphal and the Arizona Department of Environmental Quality for sharing the Altadena web camera data and the Phoenix South Mountain camera data, respectively.

8. REFERENCES

[1] U.S. Environmental Protection Agency. Visibility in mandatory federal class I areas, 1994-1998: a report to Congress, 2001.
[2] D. Baumer, S. Versick, and B. Vogel. Determination of the visibility using a digital panorama camera. Atmospheric Environment, 42(11):2593-2602, 2008.
[3] J.-Y. Bouguet. Camera calibration toolbox for Matlab, 2008.
[4] D. Dockery, C. Pope, X. Xu, J. Spengler, J. Ware, M. Fay, B. Ferris, and F. Speizer. An association between air pollution and mortality in six US cities. The New England Journal of Medicine, 329(24):1753, 1993.
[5] P. Dutta, P. Aoki, N. Kumar, A. Mainwaring, C. Myers, W. Willett, and A. Woodruff. Common Sense: participatory urban sensing using a network of handheld air quality monitors. In Proceedings of the 7th ACM Conference on Embedded Networked Sensor Systems, pages 349-350. ACM, 2009.
[6] M. Grossberg and S. Nayar. What is the space of camera response functions? In IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), volume 2, 2003.
[7] R. Honicky, E. Brewer, E. Paulos, and R. White. N-SMARTS: networked suite of mobile atmospheric real-time sensors. In Proceedings of the Second ACM SIGCOMM Workshop on Networked Systems for Developing Regions, pages 25-30. ACM, 2008.
[8] P. Ineichen, B. Molineaux, and R. Perez. Sky luminance data validation: comparison of seven models with four data banks. Solar Energy, 52(4):337-346, 1994.
[9] J.-F. Lalonde, S. G. Narasimhan, and A. A. Efros. What do the sun and the sky tell us about the camera? In submission, International Journal of Computer Vision, 2009.
[10] S. Lin, J. Gu, S. Yamazaki, and H. Shum. Radiometric calibration from a single image. In Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2004), volume 2, 2004.
[11] E. McCartney. Optics of the Atmosphere: Scattering by Molecules and Particles. John Wiley, New York, 1976.
[12] R. Perez, R. Seals, and J. Michalsky. All-weather model for sky luminance distribution. Preliminary configuration and validation. Solar Energy, 50(3):235-245, 1993.
[13] K. Prather. Our current understanding of the impact of aerosols on climate change. ChemSusChem, 2(5), 2009.
[14] A. Preetham, P. Shirley, and B. Smits. A practical analytic model for daylight. In Proceedings of the 26th Annual Conference on Computer Graphics and Interactive Techniques, pages 91-100. ACM Press/Addison-Wesley, New York, NY, USA, 1999.
[15] D. Raina, N. Parks, W. Li, R. Gray, and S. Dattner. An innovative methodology for analyzing digital visibility images in an urban environment. Journal of the Air & Waste Management Association, 55(11):1733-1742, 2005.
[16] I. Reda and A. Andreas. Solar position algorithm for solar radiation applications. Solar Energy, 76(5):577-589, 2004.
[17] L. Xie, A. Chiu, and S. Newsam. Estimating atmospheric visibility using general-purpose cameras. In Proceedings of the 4th International Symposium on Advances in Visual Computing, Part II, page 367. Springer, 2008.
[18] Y. Yu and J. Malik. Recovering photometric properties of architectural scenes from photographs. In SIGGRAPH '98: Proceedings of the 25th Annual Conference on Computer Graphics and Interactive Techniques, pages 207-217, New York, NY, USA, 1998. ACM.




posted by u2r2h at Saturday, September 25, 2010
