
Sunday, March 31, 2019

Computer Vision In Bad Weather.

Saswati Rakshit

Aim
To turn the effects of bad weather to our advantage when judging a scene from its attributes. In bad weather the atmosphere modulates the true information of an image before it reaches the observer, so based on these observations we develop methods for recovering scene properties (e.g. 3D structure, depth).

Scope/Application
Computer vision is widely used in many fields nowadays: in optical character recognition (OCR) technology to convert scanned documents to text; in face and smile detection (many new digital cameras now detect faces and smiles); in surveillance and traffic monitoring; in image-based 3D modelling, turning a collection of photographs into a 3D model; and in the Google self-driving car, which uses computer vision for distance estimation.

Introduction: Vision and the Atmosphere
Normally, in good weather, we assume that reflected light passes through the atmosphere without fading, so the brightness of a scene point recorded in the image is assumed to be unchanged. Due to atmospheric scattering, absorption and emission, however, light intensity and color are altered. Here our main concern is scattering.

Bad weather (particles in space): weather conditions differ in the type and size of the particles involved and in their concentration.
Air (molecules): scattering due to air is minimal.
Haze (aerosol): haze is certain to affect visibility.
Fog (water droplets): fog and haze have similar origins, but haze extends to heights of several miles while fog is a few hundred feet thick.
Cloud: present at high altitude.
Rain and snow: both produce effects in the image.
Here our main consideration is haze and fog, because they appear at low altitude compared to cloud.

Mechanisms of Atmospheric Scattering
Scattering depends on particle size and shape. Small particles scatter almost equally in the forward and backward directions, medium-sized particles scatter more in the forward direction, and large particles scatter almost entirely in the forward direction. In nature, particles are separated from each other, so they scatter independently, i.e. they do not interfere with one another. In multiple scattering, by contrast, a particle is exposed not only to the incident light but also to the light scattered by other particles. The single scattering function can be written as

I(θ, λ) = β(θ, λ) E(λ)    (1)

where E(λ) is the total flux incident on the volume per unit cross-section area, I(θ, λ) is the flux radiated per unit solid angle per unit volume of the medium, and β(θ, λ) is the angular scattering coefficient.

Objectives
To identify effects caused by bad weather that can be turned to our advantage, and to understand the attenuation and airlight models, which let us compute depth maps of scenes without making assumptions about scene properties or atmospheric conditions.

System Flow
Our main goal is to estimate depth and to build a 3D reconstruction of a scene under bad weather conditions. For this purpose we use two different scattering models: 1) the attenuation model and 2) the airlight model.

First we use the attenuation model. Here the images are taken at night, so environmental illumination is minimal. The aim is to estimate the depths of the light sources in the scene from two images taken under different atmospheric conditions; applying the formulas of the attenuation model, we can compute the relative depths of all sources in the scene from two images taken under two different weather conditions.

Next, to work with the airlight model, we need images taken during the day, or whenever environmental illumination cannot be ignored, i.e. when the image of the scene is affected by airlight. After selecting the 2D image we apply the formulas of the airlight model; by comparing the intensities of scene points, depth can be measured easily and a 3D reconstruction of the scene is also possible.

Mathematics and Description

Attenuation Model
A beam of light travelling through the atmosphere is attenuated by scattering, and its radiance (intensity) decreases as the pathlength increases. The attenuation model developed by McCartney is summarized below.

If a beam passes through a thin sheet (medium) of thickness dx, the intensity scattered by the sheet in direction θ is

I(θ, λ) = β(θ, λ) E(λ) dx

The total flux scattered in all directions is obtained by integrating over the entire sphere of directions:

φ(λ) = β(λ) E(λ) dx    (2)

where β(λ) is the total scattering coefficient. The fractional change in irradiance at location x can therefore be written as

dE(x, λ) / E(x, λ) = −β(λ) dx    (3)

Integrating both sides of eqn (3) between the limits x = 0 and x = d gives, for a point source,

E(d, λ) = I0(λ) e^(−β(λ)d) / d²    (4)

where I0(λ) is the radiant intensity of the point source and d is the distance between the object and the observer. Attenuation due to scattering is sometimes expressed in terms of the optical thickness

T = β(λ) d

where β is taken to be constant over a horizontal path. Eqn (4) gives the direct transmission, i.e. the flux that remains after the scattered flux has been removed.
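To make the direct transmission law of eqn (4) concrete, here is a minimal Python sketch (not part of the original derivation) that evaluates E(d) for a unit-intensity point source at a few distances under two assumed scattering coefficients; all numeric values are illustrative assumptions, not measurements.

```python
import numpy as np

def direct_transmission(I0, beta, d):
    """Direct transmission of a point source, eqn (4): E(d) = I0 * exp(-beta*d) / d**2."""
    d = np.asarray(d, dtype=float)
    return I0 * np.exp(-beta * d) / d**2

# Illustrative values (assumptions): unit-intensity source, two weather conditions.
distances = np.array([50.0, 100.0, 200.0, 400.0])   # metres
beta_mild, beta_dense = 0.005, 0.02                  # scattering coefficients (1/m)

E_mild = direct_transmission(1.0, beta_mild, distances)
E_dense = direct_transmission(1.0, beta_dense, distances)

for d, e1, e2 in zip(distances, E_mild, E_dense):
    print(f"d = {d:5.0f} m   mild fog: {e1:.3e}   dense fog: {e2:.3e}")
```

The output simply shows how quickly the received irradiance falls with distance, and how much faster it falls for the larger scattering coefficient.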
Airlight Model
Here the atmosphere itself behaves as a source of light. Environmental illumination has several sources, including direct sunlight, diffuse skylight and light reflected by the ground. In the airlight model, light intensity increases with pathlength, so apparent brightness increases with distance: if an object is at infinite distance the radiance of airlight is maximal, while for an object right in front of the observer the radiance of airlight is zero.

To describe the geometry of the model, we assume that the environmental illumination along the observer's line of sight is constant, although its direction and intensity are unknown. Consider the cone of solid angle dω subtended by a single receptor at the observer and truncated by the object at distance d. This cone between the observer and the object scatters environmental illumination towards the observer, so it acts as airlight (a source of light) whose brightness increases with pathlength.

The small volume dV at distance x from the observer is

dV = dω x² dx

The intensity of the light scattered by dV, treated as a source, is proportional to its volume:

dI(x, λ) = dV k β(λ) = dω x² dx k β(λ)    (5)

where the constant k accounts for the unknown magnitude of the environmental illumination. The light from dV reaches the observer after attenuation, so the irradiance it produces at the observer, using eqn (4), is

dE(x, λ) = dI(x, λ) e^(−β(λ)x) / x²    (6)

The radiance of dV follows from its irradiance as

dL(x, λ) = dE(x, λ) / dω = dI(x, λ) e^(−β(λ)x) / (dω x²)    (7)

Substituting (5) into (7) gives

dL(x, λ) = k β(λ) e^(−β(λ)x) dx

The total radiance of the pathlength d from the observer to the object is obtained by integrating this expression between x = 0 and x = d:

L(d, λ) = k (1 − e^(−β(λ)d))    (8)

If d = ∞, the radiance of the airlight is maximal: L(∞, λ) = k. So

L(d, λ) = L(∞, λ) (1 − e^(−β(λ)d))    (9)

Estimation of Depth Using the Attenuation Model
In this model the images are taken at night, so environmental illumination is minimal and the airlight model is not appropriate. At night the bright points of an image are typically street lights and the windows of lit rooms. On a clear night these light sources appear to the observer in their brightest and clearest form, but in bad weather their intensities diminish due to attenuation. Our goal is to estimate the depths of the light sources in the scene from two images taken under different atmospheric conditions.

Using eqn (4), the image irradiance can be written as

E(d, λ) = g I0(λ) e^(−β(λ)d) / d²    (10)

where g accounts for the optical parameters of the camera. If the detector of the camera has spectral response s(λ), the final image brightness value is

E' = ∫ s(λ) E(d, λ) dλ = (g / d²) ∫ s(λ) I0(λ) e^(−β(λ)d) dλ    (11)

Since the spectral bandwidth of the camera is narrow, we can assume β is constant over it and write

E' = (g / d²) e^(−βd) ∫ s(λ) I0(λ) dλ = (g / d²) I'0 e^(−βd)    (12)

Now take images of the same scene under two different weather conditions, say mild and dense fog, with two different scattering coefficients β1 and β2. The ratio of the two resulting image brightness values is

R = E'1 / E'2 = e^((β2 − β1) d)    (13)

Taking the natural logarithm,

R' = ln R = (β2 − β1) d    (14)

This ratio is independent of the camera sensor gain and of the intensity of the source; it is simply the difference in optical thickness (DOT) of the source between the two weather conditions. If we compute the DOTs of two different light sources and take their ratio, we obtain the relative depths of the two source locations:

R'i / R'j = di / dj    (15)

Since we may not trust the DOT computed from any single source, the calculation can be made more robust by normalizing with the sum over all sources:

di / Σk dk = R'i / Σk R'k    (16)

Here the brightness of each source pi at distance di from the observer is measured in both images, and the depths of all sources in the scene are computed up to a common scale factor. The main result of this model is that the relative depths of all sources in the scene can be computed from two images taken under two different weather conditions.
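The following Python sketch illustrates eqns (13)-(16): given the brightness of each detected light source in two registered night-time images taken under different weather conditions, it computes each source's DOT and the relative depths up to a common scale factor. The input arrays are hypothetical placeholders; in practice they would come from detecting and matching the light sources in the two images.

```python
import numpy as np

def relative_depths(E1, E2):
    """Relative depths of point light sources from two night images, eqns (13)-(16).

    E1, E2 : brightness of each source under weather conditions 1 (mild) and 2 (dense),
             same sources in the same order. Returns depths up to a common scale factor.
    """
    E1 = np.asarray(E1, dtype=float)
    E2 = np.asarray(E2, dtype=float)
    # Eqns (13)-(14): DOT difference for each source, R' = ln(E1/E2) = (b2 - b1) d
    R_prime = np.log(E1 / E2)
    # Eqn (16): normalize by the sum over all sources -> d_i / sum_k d_k
    return R_prime / R_prime.sum()

# Hypothetical measurements: brightness of four street lights in mild fog (E_mild)
# and in dense fog (E_dense). More distant sources lose more brightness between the two.
E_mild  = [0.90, 0.72, 0.55, 0.31]
E_dense = [0.80, 0.48, 0.25, 0.06]

print("relative depths:", relative_depths(E_mild, E_dense))
```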
Estimation of Depth Using the Airlight Model
In the daytime, under dense haze or fog (or even mild fog), most visible scene points are poorly illuminated and the image is dominated by airlight, which causes the recorded intensity of a scene point to increase as its distance increases. Here we consider a single airlight image and recover the 3D scene structure from this depth cue.

Let a scene point at distance d produce the airlight radiance L(d, λ). If the camera has spectral response s(λ), the brightness value recorded for that scene point is

E'(d) = g ∫ s(λ) L(d, λ) dλ    (17)

Substituting eqn (9),

E'(d) = g ∫ s(λ) L(∞, λ) (1 − e^(−β(λ)d)) dλ    (18)

If β is constant over the camera's spectral band, we can write

E'(d) = E'(∞) (1 − e^(−βd))    (19)

where E'(∞) = g ∫ s(λ) L(∞, λ) dλ. Now let

S = (E'(∞) − E'(d)) / E'(∞)    (20)

Substituting eqn (19) into eqn (20) and taking the natural logarithm,

S' = ln S = −βd    (21)

So −S' = βd gives the depth of every scene point scaled by the unknown factor β, and the 3D structure of the scene can be recovered up to this scale factor. The part of the horizon in the image whose intensity is E'(∞) is the brightest region of the image (the sky background). A small numerical sketch of this computation is given after the references.

Future Work
Next we will study and discuss dichromatic atmospheric scattering and structure from chromatic decomposition.

References
http://www.canberra.edu.au/irps/archives/vol21no1/blbalaw.html (accessed 20.04.2015).
Narasimhan, S. G. and Nayar, S. K., "Vision and the Atmosphere", International Journal of Computer Vision, vol. 48(3), pp. 233-254, 2002.
Allard's Law, http://eilv.cie.co.at/term/34 (accessed 18.03.2015).
Relation between Radiance and Irradiance, 2013, http://physics.stackexchange.com/questions/68353/relation-between-radiance-and-irradiance (accessed 18.03.2015).
Radiometry and Photometry, http://electron6.phys.utk.edu/optics421/modules/m4/radiometry.htm (accessed 28.03.2015).
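As mentioned in the airlight section above, here is a minimal Python sketch of eqns (19)-(21): given a single grayscale airlight-dominated image and the horizon (sky) brightness, it computes a scaled depth map βd for every pixel. The image array and horizon value are hypothetical inputs; a real implementation would also need to mask out scene points whose brightness is not dominated by airlight.

```python
import numpy as np

def scaled_depth_from_airlight(image, horizon_brightness):
    """Scaled depth map (beta * d) from a single airlight-dominated image.

    image              : 2D array of brightness values E'(d) per pixel
    horizon_brightness : brightness E'(inf) of the sky / horizon region
    Implements eqns (20)-(21): S = (E'(inf) - E'(d)) / E'(inf),  beta*d = -ln S.
    """
    image = np.asarray(image, dtype=float)
    S = (horizon_brightness - image) / horizon_brightness
    # Clip to avoid taking the log of zero or negative values at (or above) horizon brightness.
    S = np.clip(S, 1e-6, 1.0)
    return -np.log(S)          # = beta * d, i.e. depth up to the scale factor beta

# Hypothetical 2x3 foggy image; the brightest pixels are closest to the horizon brightness.
foggy = np.array([[0.95, 0.80, 0.60],
                  [0.90, 0.70, 0.40]])
E_inf = 0.99
print(scaled_depth_from_airlight(foggy, E_inf))
```

Brighter pixels (closer to the sky brightness) map to larger βd, i.e. greater depth, which is exactly the behaviour predicted by eqn (19).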
