A GPU deep learning metaheuristic based model for time series forecasting.

Abstract

As the new generation of smart sensors evolves towards high-sampling acquisition systems, the amount of information to be handled by learning algorithms has been increasing. The Graphics Processing Unit (GPU) architecture provides a greener, low-energy-consumption alternative for mining big data, bringing the power of thousands of processing cores into a single chip and thus opening a wide range of possible applications. In this paper (a substantial extension of the short version presented at REM2016 on April 19–21, Maldives [1]), we design a novel parallel strategy for time series learning, in which different parts of the time series are evaluated by different threads. The proposed strategy is inserted into the core of a hybrid metaheuristic model and applied to learning patterns from an important mini/microgrid forecasting problem: household electricity demand forecasting. Future smart cities will surely rely on distributed energy generation, in which citizens should be aware of how to manage and control their own resources. In this sense, energy disaggregation research will be part of several typical and useful microgrid applications. Computational results show that the proposed GPU learning strategy is scalable as the number of training rounds increases, emerging as a promising deep learning tool to be embedded into smart sensors.
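The abstract's central technical idea, having each GPU thread evaluate a different part of the time series, can be illustrated with a minimal CUDA sketch. Everything below is an assumption for exposition: the kernel name, the linear autoregressive forecasting rule, the lag count, and the squared-error metric are illustrative choices, not the authors' actual model or implementation.

#include <cstdio>
#include <cuda_runtime.h>

// Each thread scores a one-step-ahead forecast at one position of the
// series, so disjoint parts of the time series are evaluated
// concurrently by different threads (illustrative sketch, not the
// paper's implementation).
__global__ void evalSeries(const float *series, int n, int lags,
                           const float *weights, float *sqErrors)
{
    int t = blockIdx.x * blockDim.x + threadIdx.x + lags;
    if (t >= n) return;
    float pred = 0.0f;                  // assumed linear AR prediction
    for (int k = 0; k < lags; ++k)
        pred += weights[k] * series[t - 1 - k];
    float diff = pred - series[t];
    sqErrors[t - lags] = diff * diff;   // per-point squared error
}

int main()
{
    const int n = 1 << 20, lags = 24;   // e.g. 24 hourly-demand lags (assumed)
    float *series, *weights, *sqErrors;
    cudaMallocManaged(&series, n * sizeof(float));
    cudaMallocManaged(&weights, lags * sizeof(float));
    cudaMallocManaged(&sqErrors, (n - lags) * sizeof(float));
    for (int i = 0; i < n; ++i) series[i] = 0.5f;        // placeholder data
    for (int k = 0; k < lags; ++k) weights[k] = 1.0f / lags;

    int threads = 256, blocks = (n - lags + threads - 1) / threads;
    evalSeries<<<blocks, threads>>>(series, n, lags, weights, sqErrors);
    cudaDeviceSynchronize();

    double mse = 0.0;                   // host-side reduction, for brevity
    for (int i = 0; i < n - lags; ++i) mse += sqErrors[i];
    printf("MSE = %f\n", mse / (n - lags));
    cudaFree(series); cudaFree(weights); cudaFree(sqErrors);
    return 0;
}

In a hybrid metaheuristic of the kind the paper describes, a fitness evaluation like this would be invoked repeatedly as candidate models are perturbed, which is where per-thread parallelism over the series pays off.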

Keywords

Deep learning, Graphics processing unit, Hybrid forecasting model, Smart sensors

Citation

COELHO, I. M. et al. A GPU deep learning metaheuristic based model for time series forecasting. Applied Energy, v. 201, p. 412–418, 2017. Available at: <https://www.sciencedirect.com/science/article/pii/S0306261917300041>. Accessed: 16 Jan. 2018.
