Journal Articles
The links below take you to (free) PDFs of the article in question. Because of journal copyright issues, these are typically the final submitted draft to the journals, and not in the published format. The content will be the same, but page numbers and layout will differ.
- C. Karstens, K. Shourd, D. Speheger, A. Anderson, R. Smith, D. Andra, T. Smith, V. Lakshmanan, and S. Erickson, “Evaluation of near real-time preliminary tornado damage paths,” J. Operational Meteor., vol. 4, no. 10, pp. 132-141, 2016. doi: http://dx.doi.org/10.15191/nwajom.2016.0410.
- T. Smith, V. Lakshmanan, G. J. Stumpf, K. Ortega, K. Hondl, K. Cooper, K. Calhoun, D. Kingfield, K. Manross, R. Toomey, and J. Brogden, “Multi-Radar Multi-Sensor (MRMS) Severe Weather and Aviation Products: Initial Operating Capabilities,” Bulletin of the American Meteorological Society, 2016. doi: http://dx.doi.org/10.1175/BAMS-D-14-00173.1. The Multi-Radar Multi-Sensor (MRMS) system, which was developed at the National Severe Storms Laboratory and the University of Oklahoma, was made operational in 2014 at the National Centers for Environmental Prediction. Products created by the MRMS system are at a spatial resolution of approximately 1 km, with 33 vertical levels, updating every 2 minutes over the conterminous United States and southern Canada. This paper describes the initial operating capabilities for the severe weather and aviation products, which include a three-dimensional mosaic of reflectivity, guidance for hail, tornado, and lightning hazards, and nowcasts of storm location, height, and intensity.
- A. Clark, A. MacKenzie, A. McGovern, V. Lakshmanan, and R. Brown, “An automated, multi-parameter dryline identification algorithm,” Weather and Forecasting, vol. 30, pp. 1781–1794, 2015. This study aims to streamline dryline identification by developing an automated, multi-parameter dryline identification algorithm, which applies image-processing and pattern-recognition techniques to various meteorological fields and their gradients to identify drylines. The algorithm is applied to five years of high-resolution 24-h forecasts from Weather Research and Forecasting (WRF) model simulations valid April-June 2007-2011. Manually identified dryline positions, which were available from a previous study using the same dataset, are used as "truth" to evaluate the algorithm's performance.
- Y. Hwang, A. Clark, V. Lakshmanan, and S. Koch, “Improved nowcasts by blending extrapolation and model forecasts,” Journal of Applied Meteorology, vol. 30, pp. 1201–1217, 2015. A new approach that applies different weights to blend extrapolation and model forecasts, based on intensity and forecast time, is applied and tested. An image-processing method of morphing between extrapolation and model forecasts to create nowcasts is described, and the skill is compared to extrapolation forecasts and forecasts from the HRRR model. The new approach, called “Salient cross-dissolve” (Sal CD), is compared to a commonly used method called “Linear cross-dissolve” (Lin CD). Use this paper to cite w2morphtrack.
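  A minimal sketch of the time-weighted blending idea described above, closer in spirit to a linear cross-dissolve than to the paper's Sal CD method; the weight function and the 180-minute crossover time are illustrative assumptions, not values from the paper.

  ```python
  import numpy as np

  def blend_nowcast(extrap, model, lead_minutes, crossover_minutes=180.0):
      """Blend an extrapolation nowcast with a model forecast for one lead time.

      The weight on extrapolation decays linearly with lead time, so the
      extrapolation dominates early and the model dominates later.
      """
      w = max(0.0, 1.0 - lead_minutes / crossover_minutes)
      return w * extrap + (1.0 - w) * model

  # Example: a 60-minute nowcast on a tiny grid of rain rates (mm/h).
  extrap = np.array([[0.0, 2.0], [5.0, 1.0]])
  model = np.array([[1.0, 1.0], [3.0, 4.0]])
  print(blend_nowcast(extrap, model, lead_minutes=60))
  ```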
- Y. Hwang, T. Yu, V. Lakshmanan, D. Kingfield, D. Lee, and C. You, “Neuro-fuzzy gust front detection algorithm with S-band polarimetric radar,” IEEE Transactions on Geoscience and Remote Sensing, vol. PP, no. 99, pp. 1-11, 2016. Six parameters from polarimetric WSR-88D Level II data are used to characterize gust front signatures, and a gust front detection algorithm is developed. DOI: 10.1109/TGRS.2016.2628520
- V. Lakshmanan, B. Herzog, and D. Kingfield, “A method of extracting postevent storm tracks,” J. Appl. Meteo. Clim., vol. 54, pp. 451–462, Feb. 2015. Although existing storm tracking algorithms have been designed to operate in real time, they are also commonly used for post-event data analysis and research. A real-time algorithm cannot use information about the subsequent positions of a storm, because that information is not available at the time that associations between frames are carried out, but post-event analysis is not similarly constrained. Therefore, it should be possible to obtain better tracks for post-event analysis than a real-time algorithm is capable of producing. In this paper, we describe a statistical procedure to determine storm tracks from a set of storm cells identified over time. We find that this procedure results in fewer, longer-lived tracks at all scales. Use this paper to cite w2besttrack.
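  A minimal sketch of fitting a storm track to previously identified cell centroids after the fact, here with a robust Theil-Sen straight-line fit; the centroid data are made up, and this conveys only the flavor of the idea, not the w2besttrack objective function.

  ```python
  import numpy as np
  from scipy.stats import theilslopes

  # Hypothetical centroids of one storm cell identified at successive times.
  t = np.array([0.0, 5.0, 10.0, 15.0, 20.0])    # minutes since first detection
  x = np.array([10.0, 12.1, 13.9, 16.2, 18.0])  # km east of radar
  y = np.array([5.0, 5.4, 6.1, 6.3, 7.0])       # km north of radar

  # Robust straight-line fit of position vs. time: outlying centroids caused by
  # mis-associations pull the fit far less than an ordinary least-squares fit would.
  sx, ix, _, _ = theilslopes(x, t)
  sy, iy, _, _ = theilslopes(y, t)
  print(f"track motion: u = {sx * 60:.1f} km/h, v = {sy * 60:.1f} km/h")
  print(f"fitted position at t = 12 min: ({ix + 12 * sx:.1f}, {iy + 12 * sy:.1f}) km")
  ```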
- V. Lakshmanan, C. Karstens, J. Krause, K. Elmore, A. Ryzhkov, and S. Berkseth, “Which polarimetric variables are important for weather/no-weather discrimination?,” J. Atmos. Ocean. Tech., vol. 32, no. 6, pp. 1209-1223, 2015. The importance of the different variables in the context of discriminating between weather and no-weather echoes is examined. The same statistical framework can be used to study the impact of calibration errors in variables such as Zdr. Among the variables studied for their impact on the quality control of radar data, the most important were the variance of Zdr, features relating to the 3D structure of the reflectivity, and the radial derivative of PhiDP. The effect of Zdr calibration error on weather/no-weather discrimination was found to be negligible. Use this paper to cite w2qcnndp (2/2)
- L. Tang, J. Zhang, C. Langston, J. Krause, K. Howard, and V. Lakshmanan, “A physically based weather/non-weather radar echo classifier using polarimetric and environmental data in a real-time national system,” J. Atmos. Ocean. Tech., vol. 29, pp. 1106–1119, 2013. A hydrology-specific quality-control method developed at NSSL: a multi-sensor, physically based algorithm designed to classify weather and non-weather radar echoes. Compared to other quality-control methodologies that use all polarimetric variables, this algorithm's advantage lies in its simplicity, effectiveness, and computational efficiency.
- K. Elmore, Z. Flamig, V. Lakshmanan, B. Kaney, V. Farmer, and L. Rothfusz, “mPING: Crowd-sourcing weather reports for research,” Bulletin of the American Meteorological Society, vol. 95, pp. 1335–1342, 2014. We describe taking advantage of smartphones to crowd-source weather reports for research. The Weather Surveillance Radar-1988 Doppler (WSR-88D) network within the United States has recently been upgraded to include dual-polarization capability. One of the expectations that has resulted from the upgrade is the ability to discriminate between different precipitation types in winter precipitation events. To know how well any such algorithm performs, and whether new algorithms are an improvement, observations of winter precipitation type are needed. Unfortunately, the automated observing systems cannot discriminate between some of the more important types, so human observers are needed. Yet deploying dedicated human observers is wasteful and unnecessary, because the knowledge needed to identify the various precipitation types is common among the public. Gathering such observations efficiently requires that the public be engaged as citizen scientists through a very simple, convenient, non-intrusive method. Thus, a very simple "app" called mPING (mobile Precipitation Identification Near the Ground), which runs on the ubiquitous smartphone or, more generically, on web-enabled devices with GPS location capability, is used to pass observations to researchers at no additional cost to either the public or the research project. Deployed in mid-December 2012, mPING has proven to be not only very popular, but also a source of consistent, accurate observational data.
- V. Lakshmanan, C. Karstens, J. Krause, and L. Tang, “Quality control of weather radar data using polarimetric variables,” J. Atmos. Ocean. Tech., vol. 31, pp. 1234–1249, 2014. We describe a method for the quality control of weather radar data using polarimetric variables. At each range gate, a pattern vector is computed consisting of the values of the polarimetric and Doppler moments, the local variance of some of these features, and 3D features computed in a virtual-volume sense. If a pattern cannot be preclassified based on RhoHV and Z, it is presented to a neural network that was trained on historical data. The neural network and preclassifier produce a pixelwise probability of precipitation at that range gate. The range gates are then clustered into contiguous regions of reflectivity, with bimodal clustering carried out close to the radar and clustering based purely on spatial connectedness farther away from the radar. The pixelwise probabilities are averaged within each cluster, and the cluster is either retained or censored depending on whether this average probability is greater than or less than 0.5. The QC algorithm was evaluated on a set of independent cases and found to perform well, with a Heidke Skill Score (HSS) of about 0.8. A simple gate-by-gate classifier, consisting of three simple rules, is also introduced in this paper and can be used when the full QC method cannot be applied. The simple classifier has an HSS of about 0.6 on the independent dataset. Use this paper to cite w2qcnndp (1/2)
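  A minimal sketch of the final censoring step described above: cluster the echo into contiguous regions, average the per-gate precipitation probabilities within each cluster, and retain the cluster only if that mean is at least 0.5. The probability field would come from the preclassifier/neural network; the simple connectivity and the 0-dBZ echo threshold here are illustrative assumptions.

  ```python
  import numpy as np
  from scipy import ndimage

  def censor_by_cluster(refl, prob_precip, min_dbz=0.0):
      """Retain or censor whole echo clusters by their mean precipitation probability.

      refl        : 2D reflectivity field (dBZ), NaN where there is no echo
      prob_precip : 2D per-gate probability of precipitation from an upstream classifier
      """
      echo = np.isfinite(refl) & (refl > min_dbz)
      labels, nclust = ndimage.label(echo)            # spatially contiguous clusters
      out = np.full_like(refl, np.nan)
      for k in range(1, nclust + 1):
          cluster = labels == k
          if prob_precip[cluster].mean() >= 0.5:      # keep precipitating clusters
              out[cluster] = refl[cluster]
      return out                                      # censored reflectivity field
  ```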
- T. Smith, J. Gao, K. Calhoun, D. Stensrud, K. Manross, K. Ortega, C. Fu, D. Kingfield, K. Elmore, V. Lakshmanan, and C. Riedel, “Examination of a real-time 3DVAR analysis system in the Hazardous Weather Testbed,” Wea. Forecasting, vol. 29, pp. 63–77, 2014. We describe how forecasters and research meteorologists tested a real-time three-dimensional variational data assimilation (3DVAR) system in the Hazardous Weather Testbed during the springs of 2010-2012 to determine its capabilities to assist in the warning process for severe storms. This storm-scale system updates a dynamically consistent three-dimensional wind field every 5 minutes, with a horizontal resolution of 1 km and an average vertical resolution of 400 m.
- V. Lakshmanan and T. W. Humphrey, “A MapReduce technique to mosaic continental-scale weather radar data in real-time,” IEEE J. of Select Topics in Appl. Earth Obs. and Remote Sensing, vol. 7, no. 2, pp. 721–732, 2014. DOI: 10.1109/JSTARS.2013.2282040. Because of the high temporal and spatial resolution of data available from the United States' network of weather radars, creating radar mosaics in real time has been possible only through compromises on the quality, timeliness, or resolution of the mosaics. MapReduce is a programming model that can be employed for processing and generating large data sets by distributing embarrassingly parallel computations and data storage across a distributed cluster of machines. A MapReduce approach to computing radar mosaics on a distributed cluster of compute nodes is presented. The approach is massively scalable and is able to create high-resolution 3D radar mosaics over the Continental United States in real time. Use this paper to cite w2merger (2/2)
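  A toy, single-process sketch of the map/reduce decomposition of mosaicking: each radar gate is mapped to the grid cell it falls in along with a weighted value, and each cell's contributions are then reduced to a weighted average. The gate and grid objects (gate.lat, grid.cell_index, the range-based weight) are hypothetical placeholders, not the paper's weighting scheme.

  ```python
  from collections import defaultdict

  def map_gate(gate, grid):
      """Map one radar gate to (grid_cell_key, (weight * value, weight))."""
      i, j = grid.cell_index(gate.lat, gate.lon)   # hypothetical lat/lon -> cell lookup
      weight = 1.0 / (1.0 + gate.range_km)         # illustrative range-based weight
      return (i, j), (weight * gate.reflectivity, weight)

  def reduce_cell(contributions):
      """Combine the contributions of all radars that map to one grid cell."""
      wsum = sum(w for _, w in contributions)
      return sum(wv for wv, _ in contributions) / wsum if wsum > 0 else float("nan")

  def mosaic(gates, grid):
      buckets = defaultdict(list)
      for gate in gates:                           # the "map" phase
          key, value = map_gate(gate, grid)
          buckets[key].append(value)
      return {key: reduce_cell(vals) for key, vals in buckets.items()}  # the "reduce" phase
  ```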
- J. Kain, M. Coniglio, J. Correia, A. Clark, P. Marsh, C. Ziegler, V. Lakshmanan, S. Miller, S. Dembek, S. Weiss, F. Kong, M. Xue, R. Sobash, R. Dean, I. Jirak, and C. Melick, “A feasibility study for probabilistic convection initiation forecasts based on explicit numerical guidance,” Bull. Amer. Meteor. Soc., vol. 94, pp. 1213–1225, 2013. We describe the convection initiation (CI) component of the 2011 Spring Forecasting Experiment in the NOAA Hazardous Weather Testbed (HWT). The overarching goal of the CI component was to define the primary challenges of CI prediction and establish a framework for additional studies and possible routine forecasting of CI. As in previous HWT experiments, the CI study was a collaborative effort between forecasters and researchers. As such, it involved experimental modeling activities, including high-resolution ensemble prediction, and experimentation in probabilistic forecasting of CI. A summary of CI-related activities from the experiment is presented, including working definitions, methods for detection and prediction, and strategies for making and verifying CI forecasts.
- J. Gao, T. Smith, D. Stensrud, C. Fu, K. Calhoun, K. Manross, J. Brogden, V. Lakshmanan, Y. Wang, K. Thomas, K. Brewster, and M. Xue, “A real-time weather-adaptive 3DVAR analysis system for severe weather detections and warnings with automatic storm positioning capability,” Wea. Forecasting, vol. 28, pp. 727–745, 2013. We describe how a real-time, weather-adaptive three-dimensional variational data assimilation (3DVAR) system has been adapted for the NOAA Warn-on-Forecast (WoF) project to incorporate all available radar observations within a moveable analysis domain. The goal of this real-time 3DVAR system is to help meteorologists better track severe weather events and eventually provide better warning information to the public, ultimately saving lives and reducing property damage. The unique features of the system include (1) the incorporation of radar observations from multiple WSR-88Ds, (2) the ability to automatically detect and analyze severe local hazardous weather events at 1-km horizontal resolution every 5 minutes in real time based on the current weather situation, and (3) the identification of strong circulations embedded in thunderstorms. Although still in an early development stage, the system performed very well within NOAA's Hazardous Weather Testbed (HWT) Experimental Warning Program during the spring of 2010, when many severe weather events were successfully detected and analyzed.
- M. Miller, V. Lakshmanan, and T. Smith, “An automated method for depicting mesocyclone paths and intensities,” Wea. Forecasting, vol. 28, pp. 570–585, 2013. The location and intensity of mesocyclone circulations can be tracked in real time by accumulating, over time, areas of high azimuthal shear computed from Doppler velocity observations at low levels (0-3 km AGL) and midlevels (3-6 km AGL). Azimuthal shear is computed in a noise-tolerant manner by fitting the Doppler velocity observations in the neighborhood of a pulse volume to a plane and finding the slope of that plane. Rotation tracks created in this manner are contaminated by nonmeteorological signatures caused by poor velocity dealiasing, ground clutter, radar test patterns, and spurious shear values. In order to improve the quality of these fields for real-time use and for an accumulated multi-year climatology, new dealiasing strategies, data thresholds, and Multiple Hypothesis Tracking (MHT) techniques have been implemented. These techniques remove nearly all nonmeteorological contaminants and successfully isolate the rotation tracks. The resulting rotation tracks are closely associated with mesocyclone paths and intensities. Storm survey information is then used to compare the rotation tracks to observed tornado damage. Use this paper to cite w2circ and/or rotation-tracks (3/3)
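  A rough sketch of the plane-fit idea described above: over a small neighborhood of a pulse volume, fit the Doppler velocity to a plane by least squares and take the slope in the azimuthal direction as the azimuthal shear. The neighborhood size and array layout are illustrative assumptions, and edge handling is ignored.

  ```python
  import numpy as np

  def azimuthal_shear(velocity, azimuths_rad, ranges_km, i, j, half_az=2, half_r=2):
      """Fit velocity around gate (i, j) to a plane; return the azimuthal slope.

      velocity is indexed [azimuth, range]; (i, j) is assumed to be away from array edges.
      The result is in s^-1 when velocity is in m/s and distances are converted to meters.
      """
      rows, vals = [], []
      az0, r0 = azimuths_rad[i], ranges_km[j]
      for di in range(-half_az, half_az + 1):
          for dj in range(-half_r, half_r + 1):
              v = velocity[i + di, j + dj]
              if np.isnan(v):
                  continue                                         # tolerate missing gates
              daz_m = r0 * (azimuths_rad[i + di] - az0) * 1000.0   # azimuthal distance (m)
              dr_m = (ranges_km[j + dj] - r0) * 1000.0             # radial distance (m)
              rows.append([1.0, daz_m, dr_m])
              vals.append(v)
      coeffs, *_ = np.linalg.lstsq(np.array(rows), np.array(vals), rcond=None)
      return coeffs[1]   # azimuthal slope of the fitted plane v = a + b*daz + c*dr
  ```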
- J. Buler, V. Lakshmanan, and D. La Puma, “Improving weather radar data processing for biological research applications: Final report,” Tech. Rep. G11AC20489, Patuxent Wildlife Research Center, USGS, Laurel, MD, 2012. Technical Report to the Patuxent Wildlife Research Center. Use this paper to cite w2birddensity
- V. Lakshmanan, M. Miller, and T. Smith, “Quality control of accumulated fields by applying spatial and temporal constraints,” J. Atmos. Ocean. Tech., vol. 30, pp. 745–757, 2013. We note that accumulating gridded fields over time greatly magnifies the impact of noise in the individual grids. A quality control method that takes advantage of spatial and temporal coherence can reduce the impact of noise in accumulation grids. Such a method can be implemented using the image processing techniques of hysteresis and multiple hypothesis tracking (MHT). These steps are described in this paper and the method is applied to simulated data to quantify the improvements and explain the effect of various parameters. Finally, the quality control technique is applied to some illustrative real-world datasets. Use this paper to cite w2circ and/or rotation-tracks (2/3)
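  A minimal sketch of the hysteresis step described above: keep only those connected regions of an accumulation grid that contain at least one pixel above a high threshold. The thresholds, connectivity, and toy grid are illustrative, and the temporal (MHT) part of the method is not shown.

  ```python
  import numpy as np
  from scipy import ndimage

  def hysteresis(field, low, high):
      """Retain regions above `low` only if they contain at least one value above `high`."""
      labels, nregions = ndimage.label(field >= low)
      keep = np.zeros(field.shape, dtype=bool)
      for k in range(1, nregions + 1):
          region = labels == k
          if field[region].max() >= high:
              keep |= region
      return np.where(keep, field, 0.0)

  # Example on a toy accumulated azimuthal-shear grid (s^-1).
  grid = np.array([[0.002, 0.004, 0.000],
                   [0.000, 0.012, 0.003],
                   [0.001, 0.000, 0.000]])
  print(hysteresis(grid, low=0.002, high=0.010))
  ```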
- V. Lakshmanan, K. Hondl, C. Potvin, and D. Preignitz, “An improved method to compute radar echo top heights,” Wea. Forecasting, vol. 28, pp. 481–488, Apr. 2013. It is demonstrated that the traditional method, in widespread use on NEXRAD and other radar systems, of computing echo top heights results in both under- and overestimates. It is proposed that echo tops be computed by interpolating between the elevation scans that bracket the echo top threshold. The traditional and proposed techniques are evaluated using simulated radar samples of a modeled thunderstorm and by sampling a high-resolution Range Height Indicator (RHI) of a real thunderstorm. It is shown that the proposed method results in smaller errors when higher elevation scans are available. Use this paper to cite w2echotop
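  A minimal sketch of the proposed interpolation: find the pair of elevation scans that bracket the echo-top threshold above a location and interpolate the beam height between them. The 18-dBZ threshold and the sample values are illustrative.

  ```python
  import numpy as np

  def echo_top_interpolated(heights_km, refl_dbz, threshold=18.0):
      """Interpolate the echo-top height between the elevation scans that bracket it.

      heights_km : beam heights of successive elevation scans above a location (ascending)
      refl_dbz   : reflectivity sampled by each of those scans
      """
      for k in range(len(refl_dbz) - 1, 0, -1):
          below, above = refl_dbz[k - 1], refl_dbz[k]
          if below >= threshold > above:
              frac = (below - threshold) / (below - above)   # linear in reflectivity
              return heights_km[k - 1] + frac * (heights_km[k] - heights_km[k - 1])
      return np.nan   # threshold never bracketed

  # Example: the 18-dBZ echo top falls between the 8-km and 10-km beam heights.
  print(echo_top_interpolated([4.0, 6.0, 8.0, 10.0], [45.0, 38.0, 25.0, 12.0]))
  ```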
- J. Sieglaff, D. Hartung, W. Feltz, L. Cronce, and V. Lakshmanan, “Development and application of a satellite-based convective cloud object-tracking methodology: A multipurpose data fusion tool,” J. Applied Meteorology and Clim., vol. 30, pp. 510–525, Mar. 2013. We build upon the Warning Decision Support System – Integrated Information (WDSS-II) object-tracking capabilities. The system uses an IR-window-based field as input to WDSS-II for cloud-object identification and tracking and a UW-CIMSS-developed post-processing algorithm to combine the WDSS-II cloud-object output. The final output of the system is used to fuse multiple meteorological datasets into a single cloud-object framework. The object-tracking system performance is quantitatively measured for 34 convectively active periods over the central and eastern United States during 2008 and 2009. The analysis shows improved object-tracking performance with both increased temporal resolution of the geostationary data and increased cloud-object size. The system output is demonstrated as an effective means of fusing a variety of meteorological data, including raw satellite observations, satellite algorithm output, radar observations and derived output, numerical weather prediction model output, and lightning detection data, for studying the growth of deep convective clouds.
- J. Newman, V. Lakshmanan, P. Heinselman, M. Richman, and T. Smith, “Range-correcting azimuthal shear in Doppler radar data,” Wea. Forecasting, vol. 28, pp. 194–211, 2013. We develop a range correction for LLSD shear so as to better enable tornado detection from Doppler velocity data. We examine linear regression and artificial neural networks as range correction models and find that both methods produce good fits for simulated shear data. We find that range correction increases tornadic shear values by nearly an order of magnitude, facilitating differentiation between tornadic and nontornadic scans in the tornadic events. Use this paper to cite w2circ and/or rotation-tracks (1/3)
- V. Lakshmanan, J. Crockett, K. Sperrow, M. Ba, and L. Xin, “Tuning the auto-nowcaster automatically,” Wea. Forecasting, vol. 27, no. 6, pp. 1568–1579, 2012. A genetic algorithm approach to tuning the ANC is described. The process consisted of choosing data sets and employing an objective forecast verification technique and a fitness function. The ANC was modified to create forecasts offline using parameters iteratively generated by the genetic algorithm. The parameters are generated by probabilistically combining parameters that result in better performance, leading to better and better parameters as the tuning process proceeds. The forecasts created by the ANC using the automatically determined parameters are compared with the forecasts created by the ANC using parameters that were the result of human tuning.
- P. Marsh, J. Kain, V. Lakshmanan, A. Clark, N. Hitchens, and J. Hardy, “A method for calibrating deterministic forecasts of rare events,” Wea. Forecasting, vol. 27, pp. 531–538, 2012. We propose a method of deriving calibrated probabilistic forecasts of rare events from deterministic forecasts by fitting a parametric kernel density function to the model's historical spatial error characteristics. This kernel density function is then applied to individual forecast fields to produce probabilistic forecasts.
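  A minimal sketch, under simplifying assumptions, of turning a deterministic rare-event forecast into a smooth probability-like field by spreading each forecast event with a kernel. In the paper the kernel is parametric and fit to the model's historical spatial error characteristics; here a fixed-width Gaussian simply stands in for it.

  ```python
  import numpy as np
  from scipy.ndimage import gaussian_filter

  def spread_deterministic_forecast(event_grid, sigma_gridpoints=5.0):
      """Spread a binary (0/1) deterministic event forecast with a Gaussian kernel.

      A calibrated version would fit the kernel's shape and width to the model's
      historical displacement errors rather than use a fixed sigma.
      """
      return gaussian_filter(event_grid.astype(float), sigma=sigma_gridpoints)

  # Example: one forecast event point becomes a smooth bump of nonzero probabilities.
  grid = np.zeros((41, 41))
  grid[20, 20] = 1.0
  print(spread_deterministic_forecast(grid)[20, 18:23].round(4))
  ```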
- V. Lakshmanan, R. Rabin, J. Otkin, J. Kain, and S. Dembek, “Visualizing model data using a fast approximation of a radiative transfer model,” J. Atmos. Ocean. Tech., vol. 29, pp. 745–754, 2012. We demonstrate that it is possible to approximate the radiative transfer model using a universal approximator whose parameters can be determined by fitting the output of the forward model to the model forecast fields from which it was computed. The resulting approximation is very close to the complex radiative transfer model and has the advantage that it can be computed in a matter of minutes. This approximation is carried out on model forecasts to demonstrate its utility as a visualization and forecasting tool.
- A. Zahraei, K. Hsu, S. Sorooshian, J. Gourley, V. Lakshmanan, Y. Hong, and T. Bellerby, “Quantitative precipitation nowcasting: A Lagrangian pixel-based approach,” Atmos. Research, vol. 118, pp. 418–434, 2012. We introduce a pixel-based algorithm for Short-term Quantitative Precipitation Forecasting (SQPF) using radar-based rainfall data. The proposed algorithm, called Pixel-Based Nowcasting (PBN), tracks severe storms with a hierarchical mesh-tracking algorithm to capture storm advection in space and time at high resolution from radar imagery. The extracted advection field is then extended to nowcast the rainfall field over the next 3 h based on a pixel-based Lagrangian dynamic model. The proposed algorithm is compared with two other nowcasting algorithms (WCN: Watershed-Clustering Nowcasting and PER: PERsistency) for ten thunderstorm events over the conterminous United States. An object-based verification metric and traditional statistics are used to evaluate the performance of the proposed algorithm. It is shown that the proposed algorithm is superior to the comparison algorithms and is effective in tracking and predicting severe storm events for the next few hours.
- J. Cintineo, T. Smith, V. Lakshmanan, H. Brooks, and K. Ortega, “An objective high-resolution hail climatology of the contiguous United States,” Wea. Forecasting, vol. 27, pp. 1235–1248, 2012. We develop a hail climatology over the CONUS and compare the results with a reports-based climatology. Past hail climatologies have typically relied on the National Oceanic and Atmospheric Administration's (NOAA) National Climatic Data Center's (NCDC) Storm Data, which has numerous reporting biases and non-meteorological artifacts. This research seeks to quantify the spatial and temporal characteristics of contiguous U.S. (CONUS) hail fall, derived from multi-radar multi-sensor (MRMS) algorithms for several years during the Next-Generation Radar (NEXRAD) era, leveraging the Multi-Year Reanalysis Of Remotely Sensed Storms (MYRORSS) dataset at NOAA's National Severe Storms Laboratory (NSSL). The primary MRMS product used in this study is the maximum expected size of hail (MESH). The preliminary climatology includes 42 months of quality-controlled and re-processed MESH grids, which span the warm seasons of 4 years (2007-2010), covering 98% of all Storm Data hail reports during that time. The dataset has a spatial resolution of 0.01° latitude × 0.01° longitude × 31 vertical levels, and a 5-minute temporal resolution. Radar-based and reports-based methods of hail climatology are compared. MRMS MESH demonstrates superior coverage and resolution over Storm Data hail reports, and is largely unbiased. The results reveal a broad maximum of annual hail fall in the Great Plains and a diminished secondary maximum in the southeast U.S. Potential explanations for the differences between the two methods of hail climatology are also discussed.
- V. Lakshmanan, “Image processing of weather radar reflectivity data: Should it be done in Z or dBZ?,” Elec. J. Severe Storms Meteo., vol. 7, no. 3, pp. 1–4, 2012. We address the common belief that processing, such as interpolation or smoothing, of weather radar reflectivity fields ought to be carried out in the reflectivity factor (Z) and not on its logarithm (dBZ). It is demonstrated in this note that such faith is misplaced, i.e., that processing in dBZ is better.
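  A tiny worked example of the two processing spaces the note compares, using only the standard conversion Z = 10^(dBZ/10); the values are arbitrary, and the example shows that the two spaces disagree, not which one is preferable.

  ```python
  import numpy as np

  dbz = np.array([20.0, 50.0])      # two reflectivity samples in dBZ
  z = 10.0 ** (dbz / 10.0)          # linear reflectivity factor Z (mm^6 m^-3)

  mean_in_dbz = dbz.mean()                   # average computed in dBZ space
  mean_in_z = 10.0 * np.log10(z.mean())      # average computed in Z, converted back to dBZ
  print(mean_in_dbz, round(mean_in_z, 1))    # 35.0 vs. ~47.0: the two spaces differ
  ```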
- V. Lakshmanan, J. Zhang, K. Hondl, and C. Langston, “A statistical approach to mitigating persistent clutter in radar reflectivity data,” IEEE J. Selected Topics in Applied Earth Observations and Remote Sensing, vol. 5, pp. 652–662, Apr. 2012. We describe a statistical approach to creating a clutter map from "found data," i.e., data not specifically collected in clear air. Different methods of mitigating ground clutter are then compared using information-theoretic and statistical measures, and the best mitigation approach is chosen. The technique described in this paper allows for the mitigation of persistent ground clutter returns in data where signal processing techniques have not been applied or have been applied conservatively. It is also helpful for correcting mobile radar data, where the creation of a clear-air clutter map is impractical. Accordingly, the technique is demonstrated in each of the above situations. Use this paper to cite w2cluttermap
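  A minimal sketch of the "found data" idea only: estimate, per gate, how often an echo is present across many routinely collected scans, and flag gates whose occurrence frequency is implausibly high for weather alone. The thresholds are illustrative assumptions, and the paper's information-theoretic comparison of mitigation methods is not shown.

  ```python
  import numpy as np

  def clutter_frequency_map(scans, echo_threshold_dbz=5.0):
      """Fraction of scans in which each gate contains an echo above a threshold.

      scans : array of shape (n_scans, n_azimuths, n_gates) of reflectivity (dBZ),
              drawn from routine ("found") data rather than dedicated clear-air scans.
      """
      has_echo = np.isfinite(scans) & (scans > echo_threshold_dbz)
      return has_echo.mean(axis=0)

  def clutter_mask(scans, frequency_threshold=0.8):
      """Flag gates that contain echo far more often than weather alone would explain."""
      return clutter_frequency_map(scans) > frequency_threshold
  ```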
- A. Hobson, V. Lakshmanan, T. Smith, and M. Richman, “An automated technique to categorize storm type from radar and near-storm environment data,” Atmos. Research, vol. 111, no. 7, pp. 104–113, 2012. An automated technique for storm classification is described. Storms are identified and clustered within CONUS radar and environmental data using a K-means clustering and watershed segmentation technique. Decision trees are trained on individual attributes of these storms and used to predict storm types at multiple scales.
- M. Zhu, V. Lakshmanan, P. Zhang, Y. Hong, K. Cheng, and S. Chen, “Spatial verification using a true metric,” Atmospheric Research, vol. 102, no. 4, pp. 408–419, 2011. We introduce a spatial verification metric that is capable of ordering forecasts without the necessity to carry out filtering, warping or windowing operations on the images. This metric is illustrated on synthetic and real model forecasts of precipitation.
- S. McCarroll, M. Yeary, D. Hougen, V. Lakshmanan, and S. Smith, “Approaches for compression of super-resolution WSR-88D data,” IEEE Geosci. and Remote Sensing Letters, vol. PP, no. 99, pp. 191–195, 2010. We introduce a lossless radial-by-radial compression method that takes advantage of special properties of weather radar data. The method was tested on Level II reflectivity data from several S-band Doppler weather radars and compared with two general-purpose compression algorithms. The newly developed algorithm was approximately 15% better than the next best approach.
- S. Sen Roy, V. Lakshmanan, S. Roy Bhowmik, and S. Thampi, “Doppler weather radar based nowcasting of Cyclone Ogni,” J. Earth Syst. Sci., vol. 119, no. 2, pp. 183–199, 2010. We describe an offline analysis of Indian Doppler Weather Radar (DWR) data from Cyclone Ogni using a suite of radar algorithms as implemented on NEXRAD, along with the advanced algorithms developed jointly by the National Severe Storms Laboratory (NSSL) and the University of Oklahoma. We demonstrate the applicability of the various algorithms to Indian radar data and the improvement in quality control, and we evaluate the benefit of nowcasting capabilities in Indian conditions. New information about the tropical cyclone structure, as derived from application of the algorithms, is also discussed in this study. Finally, we suggest improvements that could be made to Indian data collection strategies, networking, and real-time analysis. Since this is the first study of its kind to process and utilize DWR data in a tropical climate, the suggestions on real-time analysis and data collection strategies made in this paper would, in many cases, be beneficial to other countries embarking on DWR network modernization programs.
- V. Lakshmanan and J. Kain, “A Gaussian mixture model approach to forecast verification,” Wea. Forecasting, vol. 25, no. 3, pp. 908–920, 2010. We introduce a new approach in which the observed and forecast fields are broken down into a mixture of Gaussians, and the parameters of the Gaussian Mixture Model fit are examined to identify translation, rotation, and scaling errors. We discuss the advantages of this method over traditional filtering or object-based methods and interpret the resulting scores on a standard verification dataset.
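  A minimal sketch of the Gaussian-mixture idea: treat each precipitation field as a spatial density, sample grid points in proportion to intensity, fit a mixture of Gaussians to the forecast and to the observations, and compare the fitted parameters. The sampling scheme, component count, and the crude translation-error summary are illustrative simplifications, not the paper's formulation.

  ```python
  import numpy as np
  from sklearn.mixture import GaussianMixture

  def fit_field_gmm(field, n_components=2, n_samples=2000, seed=0):
      """Fit a GMM to grid coordinates sampled in proportion to field intensity."""
      rng = np.random.default_rng(seed)
      iy, ix = np.nonzero(field > 0)
      weights = field[iy, ix].astype(float)
      idx = rng.choice(len(iy), size=n_samples, p=weights / weights.sum())
      points = np.column_stack([ix[idx], iy[idx]]).astype(float)
      return GaussianMixture(n_components=n_components, random_state=seed).fit(points)

  def translation_error(gmm_forecast, gmm_observed):
      """Crude translation error: distance between the mixtures' overall mean positions."""
      mean_f = (gmm_forecast.weights_[:, None] * gmm_forecast.means_).sum(axis=0)
      mean_o = (gmm_observed.weights_[:, None] * gmm_observed.means_).sum(axis=0)
      return float(np.linalg.norm(mean_f - mean_o))
  ```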
- V. Lakshmanan and T. Smith, “An objective method of evaluating and devising storm tracking algorithms,” Wea. Forecasting, vol. 25, no. 2, pp. 721–729, 2010. We introduce a set of easily computable bulk statistics that can be used to directly evaluate the performance of tracking algorithms on specific characteristics. We apply the evaluation method to a diverse set of radar reflectivity data cases and note the characteristic behavior of five different storm tracking algorithms proposed in the literature and now employed in widely used nowcasting systems. Based on this objective evaluation, we devise a storm tracking algorithm that performs consistently and better than any of the previously suggested techniques. Use this paper to cite w2segmotionll (4/4) and w2scoretrack
- V. Lakshmanan, K. Elmore, and M. Richman, “Reaching scientific consensus through a competition,” Bull. of Amer. Meteo. Soc., vol. 91, pp. 1423–1427, 2010. We describe the AI competition that we (members of the STAC committee for AI in the American Meteorological Society) have been conducting for the past two years and what we learned in each year.
- T. Smith and V. Lakshmanan, “Real-time, rapidly updating severe weather products for virtual globes,” Computers and Geosci., 2009. DOI: 10.1016/j.cageo.2010.03.023. We demonstrate that the availability of standards for the specification and transport of virtual globe data products has made it possible to generate spatially precise, geo-referenced images and to distribute these centrally created products via a web server to a wide audience. In this paper, we describe the data and methods for enabling severe weather threat analysis information inside a KML framework. The method of creating severe weather diagnosis products and translating them to KML and image files is described. We illustrate some of the practical applications of these data when they are integrated into a virtual globe display. The availability of standards for interoperable virtual globe clients has not completely alleviated the need for custom solutions; we conclude by pointing out several of the limitations of the general-purpose virtual globe clients currently available. Use this paper to cite the WDSS-II display.
- V. Lakshmanan, J. Zhang, and K. Howard, “A technique to censor biological echoes in radar reflectivity data,” J. Applied Meteorology, vol. 49, pp. 435–462, Mar. 2010. We describe a technique that identifies candidate bloom based on the range-variance of reflectivity in areas of bloom, and uses the global, rather than local, characteristics of the echo to discriminate between bloom and widespread rain. Every range gate is assigned a probability that it corresponds to bloom using morphological operations, and a neural network is trained using this probability as one of the input features. We demonstrate that this technique is capable of identifying and removing echoes due to biological targets and other types of artifacts while retaining echoes that correspond to precipitation. Use this paper to cite w2qcnn (2/2)
- V. Lakshmanan and T. Smith, “Data mining storm attributes from spatial grids,” J. Atmos. Ocean. Tech., vol. 26, no. 11, pp. 2353–2365, 2009. A technique to identify storms and capture scalar features within the geographic and temporal extent of the identified storms is described. The identification technique relies on clustering grid points in an observation field to find self-similar and spatially coherent clusters that meet the traditional understanding of what storms are. From these storms, geometric, spatial, and temporal features can be extracted. These scalar features can then be data mined to answer many types of research questions in an objective, data-driven manner. This is illustrated by using the technique to answer questions of forecaster skill and lightning predictability. Use this paper to cite w2segmotionll (3/4).
- V. Lakshmanan, K. Hondl, and R. Rabin, “An efficient, general-purpose technique for identifying storm cells in geospatial images,” J. Atmos. Oceanic Technol., vol. 26, no. 3, pp. 523–537, 2009. An efficient sequential morphological technique called the watershed transform is adapted and extended so that it can be used for identifying storms. The parameters available in the technique and the effect of these parameters are also explained. The method is demonstrated on different types of geospatial radar and satellite images. Pointers are provided on the effective choice of parameters to target the resolutions, data quality constraints, and dynamic range found in observational datasets. Use this paper to cite w2segmotionll (2/4).
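  A minimal scikit-image-based sketch of watershed cell identification on a smoothed reflectivity grid; the smoothing, the 30-dBZ threshold, and the peak-separation distance are illustrative, and this is not the paper's extended watershed transform.

  ```python
  import numpy as np
  from scipy import ndimage
  from skimage.feature import peak_local_max
  from skimage.segmentation import watershed

  def identify_cells(refl_dbz, min_dbz=30.0, min_distance=5):
      """Label storm cells by flooding the negated reflectivity from its local maxima."""
      smooth = ndimage.gaussian_filter(refl_dbz, sigma=1.0)
      mask = smooth >= min_dbz                        # restrict to significant echo
      peaks = peak_local_max(smooth, min_distance=min_distance, labels=mask.astype(int))
      markers = np.zeros(smooth.shape, dtype=int)
      markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
      return watershed(-smooth, markers=markers, mask=mask)   # 0 = background, 1..N = cells
  ```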
- I. Adrianto, T. Trafalis, and V. Lakshmanan, “Support vector machines for spatiotemporal tornado prediction,” Int’l J. of General Systems, vol. 38, no. 7, pp. 759–776, 2009. DOI:10.1080/03081070601068629. We extend our earlier study (published in IJCNN) to use a set of 33 storm days and demonstrate that it is possible to create a principled estimate of the probability of a tornado at a particular location within a circumscribed time window. The use of support vector machines for predicting the location and time of tornadoes is presented. We utilize a least-squares methodology to estimate shear, quality control of radar reflectivity, morphological image processing to estimate gradients, fuzzy logic to generate compact measures of tornado possibility and support vector machine classification to generate the final spatiotemporal probability field. On the independent test set, this method achieves a Heidke Skill Score (HSS) of 0.60 and a Critical Success Index (CSI) of 0.45.
- V. Lakshmanan, T. Smith, G. J. Stumpf, and K. Hondl, “The warning decision support system – integrated information,” Wea. Forecasting, vol. 22, no. 3, pp. 596–612, 2007. We lay out the case for multi-radar, multi-sensor weather applications, describe the WDSS-II framework, and briefly touch upon many of the WDSS-II applications, including the display. This is the paper to cite if you wish to cite WDSS-II in your research papers.
- V. Lakshmanan, A. Fritz, T. Smith, K. Hondl, and G. J. Stumpf, “An automated technique to quality control radar reflectivity data,” J. Applied Meteorology, vol. 46, pp. 288–305, Mar 2007. We describe an automated technique to perform quality control on WSR-88D reflectivity data. In this paper, we use a neural network to combine the individual features, some of which have already been proposed in the literature and some of which we introduce in this paper, into a single discriminator that can distinguish between "good" and "bad" echoes. The gate-by-gate discrimination provided by the neural network is followed by more holistic post-processing based on spatial contiguity constraints and object identification to yield quality-controlled radar reflectivity scans that have most of the bad echo removed, while leaving most of the good echo untouched. The quality control algorithm described in this paper is compared against the quality control algorithm currently used in operations on the NEXRAD Radar Products Generator. A possible multi-sensor extension to this technique is demonstrated. Use this paper to cite w2qcnn (1/2)
- N. Pal, A. Mandal, S. Pal, J. Das, and V. Lakshmanan, “Fuzzy rule-based approach for detection of bounded weak-echo regions in radar images,” J. Appl. Meteo. and Clim., vol. 45, no. 9, pp. 1304–1312, 2006. A method for the detection of a bounded weak-echo region (BWER) within a storm structure that can help in the prediction of severe weather phenomena is presented. A fuzzy rule-based approach that takes care of the various uncertainties associated with a radar image containing a BWER has been adopted. The proposed technique automatically finds some interpretable (fuzzy) rules for classification of radar data related to BWER. The radar images are preprocessed to find subregions (or segments) that are suspected candidates for BWERs. Each such segment is classified into one of three possible cases: strong BWER, marginal BWER, or no BWER. In this regard, spatial properties of the data are being explored. The method has been tested on a large volume of data that are different from the training set, and the performance is found to be very satisfactory. It is also demonstrated that an interpretation of the linguistic rules extracted by the system described herein can provide important characteristics about the underlying process.
- V. Lakshmanan, T. Smith, K. Hondl, G. J. Stumpf, and A. Witt, “A real-time, three dimensional, rapidly updating, heterogeneous radar merger technique for reflectivity, velocity and derived products,” Wea. Forecasting, vol. 21, no. 5, pp. 802–823, 2006. We describe a technique for taking the base radar data, and derived products, from multiple radars and combining them in real time into a rapidly updating 3D merged grid, such that an estimate of the radar product combined from all the different radars can be extracted from these agents at any time. We describe the model for the intelligent agents so as to account for the varying radar beam geometry with range, vertical gaps between radar scans, lack of time synchronization between radars, varying beam resolutions between different types of radars, beam blockage due to terrain, differing radar calibration, and inaccurate time stamps on radar data. Further, we describe techniques for merging scalar products like reflectivity, as well as innovative, real-time techniques for combining velocity and velocity-derived products. We also describe precomputation techniques that can be utilized such that users can select a domain and start seeing three-dimensional, combined radar data over that domain in a matter of minutes. Finally, we also give pointers on derived products that can be computed from these three-dimensional merger grids. This is paper 1 (of 2) to cite w2merger.
- V. Lakshmanan, “A separable filter for directional smoothing,” IEEE Geosci. Remote Sensing Letters, vol. 1, pp. 192–195, July 2004. We develop a directional filter that is separable. A directional filter allows you to smooth an image without degrading edges. A separable filter is computationally efficient, yet this one still allows logical operations on the pixels (such as checking for missing data). The separable filter is demonstrated on radar imagery. Use this paper to cite w2smooth (2/2).
- V. Lakshmanan, R. Rabin, and V. DeBrunner, “Multiscale storm identification and forecast,” Atmos. Research, vol. 67, pp. 367–380, July 2003. We describe a method of multiscale storm identification, computing motion estimates and making short range forecasts from radar and satellite images. Use this paper to cite w2segmotionll (1/4).
- V. Lakshmanan, “Speeding up a large scale filter,” J. Atmos. Ocean. Tech., vol. 17, pp. 468–473, April 2000. I show how to speed up an elliptically shaped filter so that the filtering process can be carried out in real time. The trade-offs that this involves, including avoiding operations that are non-linear in the initial stages, are described. This method cannot deal with missing data. Use this paper to cite w2smooth (1/2).
- V. Lakshmanan, “Using a genetic algorithm to tune a bounded weak echo region detection algorithm,” J. Applied Meteorology, vol. 39, pp. 222–230, Feb. 2000. I describe how to set up a weather detection algorithm so that it is possible to tune that algorithm as more and more cases get verified. I also discuss the possibility of being able to tune an algorithm to the local climatology. I demonstrate this method on the BWER detection algorithm and describe the genetic algorithm that can make all this possible.
- C. Marzban and V. Lakshmanan, “On the uniqueness of Gandin and Murphy's equitable performance measures,” Monthly Wea. Review, vol. 127, pp. 1134–1136, June 1999. Gandin and Murphy showed that a measure they called the True Skill Score is unique in a property they call equitability. We show that it is impossible to come up with any measure that is "equitable" (in fact, any measure that involves constraints of the form posed by equitability) without making some rather daring assumptions.
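  For reference, the 2 × 2 contingency-table form of the True Skill Score (the Peirce, or Hanssen-Kuipers, score) discussed above, with a = hits, b = false alarms, c = misses, and d = correct rejections:

  ```latex
  \[
    \mathrm{TSS} \;=\; \frac{a}{a+c} \;-\; \frac{b}{b+d}
    \;=\; \frac{ad - bc}{(a+c)(b+d)}
  \]
  ```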
- V. Lakshmanan and A. Witt, “A fuzzy logic approach to detecting severe updrafts,” AI Appl., vol. 11, pp. 1–12, May 1997. The May 1997 issue of AI Applications carried this comprehensive paper on the fuzzy logic scheme to detect Bounded Weak Echo Regions (BWERs) in radar images.
Book Chapters
- V. Lakshmanan and D. Kingfield, “Extracting the climatology of thunderstorms,” in Machine Learning and Data Mining Approaches to Climate Science (V. Lakshmanan, E. Gilleland, A. McGovern, and M. Tingley, eds.), New York: Springer, 2015. We describe a fully automated method of identifying, tracking, and clustering thunderstorms to extract such a climatology, and demonstrate it by deriving the climatology of thunderstorm initiations over the Continental United States. The identification is based on the extended watershed algorithm of Lakshmanan et al. (2009), the tracking is based on the greedy optimization method suggested in Lakshmanan and Smith (2010), and the clustering is the Theil-Sen clustering method introduced in Lakshmanan et al. (2014). This method was applied to radar data collected across the conterminous United States for the year 2010 in order to determine the location of all thunderstorm initiations that year. 81% of all thunderstorm initiation points occurred in the spring and summer months and were widely dispersed across all states. The remaining 19% occurred in the fall and winter months, and a majority of these points were spatially dispersed across the southern half of the United States.
- P. Joe, S. Dance, V. Lakshmanan, D. Heizenrehder, P. James, P. Lang, T. Hengstebeck, Y. Feng, P. Li, H. Yeung, O. Suzuki, K. Doi, and J. Dai, Doppler Radar Observations - Weather Radar, Wind Profiler, Ionospheric Radar, and Other Advanced Applications, ch. Automated Processing of Doppler Radar Data for Severe Weather Warnings, pp. 33-74. Intech, 2012. [link to book] [link to chapter] We discuss the development of automated algorithms to process Doppler radar data for the purpose of forecasting severe weather in the short term, and end with recommendations for any national weather service that is considering the creation of such severe weather warnings.
- S. Haupt, V. Lakshmanan, C. Marzban, A. Pasini, and J. Williams, Artificial Intelligence Methods in the Environmental Sciences, ch. Environmental Science Models and Artificial Intelligence, pp. 3-13. Springer, 2009. [link to book] We discuss the growing importance of data-driven (as opposed to dynamics-based) methods in the creation of weather forecasts and in other environmental science applications.
- V. Lakshmanan, Artificial Intelligence Methods in the Environmental Sciences, ch. Automated Analysis of Spatial Grids, pp. 329-346. Springer, 2009. [link to book] This chapter covers image processing techniques, which play a key role in artificial intelligence applications operating on spatial data.