Visual Weather Temperature Prediction

Wei-Ta Chu, Kai-Chia Ho, and Ali Borji

Dept. of Computer Science and Information Engineering, National Chung Cheng University
Center for Research in Computer Vision, University of Central Florida


1. Introduction

In this paper, we attempt to employ convolutional recurrent neural networks for weather temperature estimation using only image data. We study ambient temperature estimation based on deep neural networks in two scenarios: a) estimating the temperature of a single outdoor image, and b) predicting the temperature of the last image in an image sequence. In the first scenario, visual features are extracted by a convolutional neural network trained on a large-scale image dataset. We demonstrate that promising performance can be obtained, and analyze how the volume of training data influences performance. In the second scenario, we consider the temporal evolution of visual appearance, and construct a recurrent neural network to predict the temperature of the last image in a given image sequence. We obtain better prediction accuracy compared to state-of-the-art models. Further, we investigate how performance varies when information is extracted from different scene regions, and when images are captured at different daytime hours. Our approach further reinforces the idea of using only visual information for cost-efficient weather prediction in the future.
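The two scenarios above can be sketched as follows. This is a minimal NumPy illustration, not the paper's actual network: it assumes per-image CNN features have already been extracted as fixed-length vectors, and all dimensions and weights here are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed shapes (illustrative only): a sequence of 5 images,
# each already encoded by a CNN into a 512-d feature vector.
seq_len, feat_dim, hid_dim = 5, 512, 64
features = rng.standard_normal((seq_len, feat_dim))

# Scenario (a): single-image estimation -- a linear readout on one
# image's feature vector (learned by regression in practice).
w_single = rng.standard_normal(feat_dim) / np.sqrt(feat_dim)
temp_single = float(features[-1] @ w_single)  # predicted temperature (arbitrary units)

# Scenario (b): a plain Elman-style RNN over the feature sequence;
# the final hidden state is regressed to the temperature of the last image.
W_in = rng.standard_normal((hid_dim, feat_dim)) / np.sqrt(feat_dim)
W_h = rng.standard_normal((hid_dim, hid_dim)) / np.sqrt(hid_dim)
w_out = rng.standard_normal(hid_dim) / np.sqrt(hid_dim)

h = np.zeros(hid_dim)
for x in features:
    h = np.tanh(W_in @ x + W_h @ h)  # recurrent update over time
temp_seq = float(w_out @ h)

print(temp_single, temp_seq)
```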

2. Dataset

Dataset1:

This dataset is built on the SkyFinder dataset [13], a large-scale collection of webcam images annotated with sky regions. Currently, for each camera, only the images captured around 11am each day are selected. After data filtering, there are 35,417 images in total.
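The "around 11am" filtering step could be sketched as below. The filename scheme and the 30-minute tolerance are assumptions for illustration; the actual SkyFinder naming and the selection window used for this dataset may differ.

```python
from datetime import datetime

# Hypothetical filenames encoding capture time, e.g. "cam01_20140321_105901.jpg".
filenames = [
    "cam01_20140321_105901.jpg",
    "cam01_20140321_140002.jpg",
    "cam01_20140322_110130.jpg",
]

def captured_around_11am(name, tolerance_minutes=30):
    """Keep an image if its timestamp falls within +/- tolerance of 11:00."""
    stamp = name.split("_", 1)[1].rsplit(".", 1)[0]      # e.g. "20140321_105901"
    t = datetime.strptime(stamp, "%Y%m%d_%H%M%S")
    offset = abs((t.hour * 60 + t.minute) - 11 * 60)     # minutes from 11:00
    return offset <= tolerance_minutes

selected = [f for f in filenames if captured_around_11am(f)]
print(selected)  # the 10:59 and 11:01 images pass; the 14:00 image is dropped
```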

Dataset2:

The scene images mentioned in the Glasner dataset [5] are used as the seed. The Glasner dataset consists of images continuously captured by 10 cameras in 10 different environments over two consecutive years. According to the camera IDs mentioned in the Glasner dataset, we collect the entire set of corresponding images from the AMOS dataset. In addition, according to the geographical information and the timestamp associated with each image, we obtain the temperature of each image from the cli-MATE website. Overall, we collect 58,555 images from 9 cameras in total (one camera's information is incorrect, and we could not successfully collect the corresponding temperature values). Please note that the number 58,555 differs from the number 53,378 mentioned in our WACV paper. This is due to some errors in our previous experiments, but the conclusions derived from the experiments should not differ significantly.
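Associating each image with a temperature amounts to matching its timestamp to the closest weather record. A minimal sketch of such nearest-timestamp matching is below; the hourly records shown are invented placeholders, and in practice they would come from the cli-MATE station nearest each AMOS camera.

```python
from bisect import bisect_left
from datetime import datetime

# Hypothetical hourly temperature records (timestamp, degrees F), sorted by time.
records = [
    (datetime(2014, 3, 21, 10, 0), 41.0),
    (datetime(2014, 3, 21, 11, 0), 45.0),
    (datetime(2014, 3, 21, 12, 0), 48.0),
]
times = [t for t, _ in records]

def temperature_at(image_time):
    """Return the temperature of the record closest in time to the image."""
    i = bisect_left(times, image_time)
    # The nearest record is either just before or just after the insertion point.
    candidates = records[max(i - 1, 0):i + 1] or records[-1:]
    return min(candidates, key=lambda r: abs(r[0] - image_time))[1]

print(temperature_at(datetime(2014, 3, 21, 11, 20)))  # closest record is 11:00 -> 45.0
```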

3. Citation

Please cite our work if you utilize this dataset.

Wei-Ta Chu, Kai-Chia Ho, and Ali Borji, "Visual Weather Temperature Prediction," Proceedings of IEEE Winter Conference on Applications of Computer Vision, pp. 234-241, 2018.

 


For any problems, please contact .


Last Updated: June 5, 2018