Big Data and Climate Change

Only after centuries have we become concerned that we live on a planet with limited resources, and about the gravity of the climate change brought on by human activities, which is advancing so fast that there may soon be no turning back. None of us thinks it is our own problem and we blame others, but the reality is that it is a problem for all of us, and therefore it affects all of us. One example is the hundreds of millions of people affected by natural disasters, such as the tsunami in Indonesia or the earthquakes in Japan, Haiti, Chile and Italy.

Another major development of recent years has been the data revolution. In the last five years we have generated more content than in all of previous human history. The use of big data is changing the way we think: decisions are now made in real time using data from multiple sources, analysed and processed by new technologies.

The relationship between big data and climate is essential for studying the consequences of climate change over long periods of time with precision. Millions of data points circulate around the planet right now, from the weather forecasts on TV to the rainfall readings recorded by any agency's gauges. Data upon data, hiding within them the perfect prediction of the future climate. Knowing how to manage and study that quantity of information in order to predict the consequences of global warming is what the Barcelona Supercomputing Centre (BSC) is exploiting. Proof of this is its new supercomputer, which can project the climate for the next 80 years. Its storage capacity is 5 petabytes, 5,000 times that of a home PC.
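That comparison is easy to verify; a minimal sketch in Python, assuming the roughly 1-terabyte drive of a typical home PC that the figure implies:

    # Check the storage comparison above. The 1 TB home-PC drive is an
    # assumption implied by the "5,000 times" figure, not a quoted spec.
    supercomputer_pb = 5
    home_pc_tb = 1                                # assumed typical home PC drive
    supercomputer_tb = supercomputer_pb * 1_000   # 1 petabyte = 1,000 terabytes
    print(supercomputer_tb / home_pc_tb)          # -> 5000.0, about 5,000 home PCs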

Among the predictions BSC has made are less rain and temperatures 7 degrees higher in the Mediterranean area by the end of this century.

BSC is working to create a single model that simulates the climate system for the whole of Europe, and it is one of the few institutions with the capacity to compile such a quantity of data.

For COP 21 in Paris, there was also a huge intergovernmental report on climate change, produced by some thirty institutions from all over the world. The simulations behind it generated about 5 petabytes of data. If a single researcher needed to go through all of those models, he or she would have to handle an enormous quantity of data and travel around the world to do the research. The big data solution to this problem consists of developing several portals, distributed all over the world, that reproduce the experiments researchers have carried out in different countries. That way, researchers can access all of this data in a faster and more efficient way.
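The portals described here work much like the Earth System Grid Federation (ESGF), the distributed network of nodes that serves the CMIP simulation archives behind the IPCC reports; naming it is my assumption, since the text does not say which system the report used. A minimal sketch of querying such a federated index from Python, using ESGF's public search endpoint:

    import requests

    # Query one node of a federated climate-data index. distrib=true asks
    # it to search the other nodes in the federation as well, so a
    # researcher never needs to know where the data physically lives.
    ENDPOINT = "https://esgf-node.llnl.gov/esg-search/search"  # assumed public node
    params = {
        "project": "CMIP5",       # multi-model experiment behind the IPCC reports
        "variable": "tas",        # near-surface air temperature
        "experiment": "rcp85",    # high-emissions scenario
        "distrib": "true",        # federated search across all portals
        "format": "application/solr+json",
        "limit": 5,
    }
    reply = requests.get(ENDPOINT, params=params, timeout=30).json()
    for doc in reply["response"]["docs"]:
        print(doc["id"])          # each hit is a dataset hosted somewhere in the federation

The key design point is distrib=true: a single query fans out across every portal in the federation, which is exactly the faster, more efficient access the paragraph describes.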
