We recommend reading the following publications:
1.- The Fourth Paradigm (in English)
Gordon Bell | Microsoft Research
This book is about a new, fourth paradigm for science based on data-intensive computing. In such scientific research, we are at a stage of development that is analogous to when the printing press was invented. Printing took a thousand years to develop and evolve into the many forms it takes today. Using computers to gain understanding from data created and stored in our electronic data stores will likely take decades, or less. The contributing authors in this volume have done an extraordinary job of helping to refine an understanding of this new paradigm from a variety of disciplinary perspectives.
In many instances, science is lagging behind the commercial world in the ability to infer meaning from data and take action based on that meaning. However, commerce is comparatively simple: things that can be described by a few numbers or a name are manufactured and then bought and sold. Scientific disciplines cannot easily be encapsulated in a few understandable numbers and names, and most scientific data does not have a high enough economic value to fuel more rapid development of scientific discovery.
It was Tycho Brahe’s assistant Johannes Kepler who took Brahe’s catalog of systematic astronomical observations and discovered the laws of planetary motion. This established the division between the mining and analysis of captured and carefully archived experimental data and the creation of theories. This division is one aspect of the Fourth Paradigm.
In the 20th century, the data on which scientific theories were based was often buried in individual scientific notebooks or, for some aspects of “big science,” stored on magnetic media that eventually become unreadable. Such data, especially from individuals or small labs, is largely inaccessible. It is likely to be thrown out when a scientist retires, or at best it will be held in an institutional library until it is discarded. Long-term data provenance as well as community access to distributed data are just some of the challenges.
Fortunately, some “data places,” such as the National Center for Atmospheric Research (NCAR), have been willing to host Earth scientists who conduct experiments by analyzing the curated data collected from measurements and computational models. Thus, at one institution we have the capture, curation, and analysis chain for a whole discipline.
In the 21st century, much of the vast volume of scientific data captured by new instruments on a 24/7 basis, along with information generated in the artificial worlds of computer models, is likely to reside forever in a live, substantially publicly accessible, curated state for the purposes of continued analysis. This analysis will result in the development of many new theories! I believe that we will soon see a time when data will live forever as archival media—just like paper-based storage— and be publicly accessible in the “cloud” to humans and machines. Only recently have we dared to consider such permanence for data, in the same way we think of “stuff” held in our national libraries and museums! Such permanence still seems far-fetched until you realize that capturing data provenance, including individual researchers’ records and sometimes everything about the researchers themselves, is what libraries insist on and have always tried to do. The “cloud” of magnetic polarizations encoding data and documents in the digital library will become the modern equivalent of the miles of library shelves holding paper and embedded ink particles.
In 2005, the National Science Board of the National Science Foundation published “Long-Lived Digital Data Collections: Enabling Research and Education in the 21st Century,” which began a dialogue about the importance of data preservation and introduced the issue of the care and feeding of an emerging group they identified as “data scientists”: “The interests of data scientists—the information and computer scientists, database and software engineers and programmers, disciplinary experts, curators and expert annotators, librarians, archivists, and others, who are crucial to the successful management of a digital data collection—lie in having their creativity and intellectual contributions fully recognized.”
2.- The Global Information Technology Report 2010-2011
The Global Information Technology Report 2010–2011 is a special project within the framework of the World Economic Forum’s Centre for Global Competitiveness and Performance and the Industry Partnership Programme for Information Technology and Telecommunications Industries. It is the result of a collaboration between the World Economic Forum and INSEAD.
The last decade has seen information and communication technologies (ICT) dramatically transforming the world, enabling innovation and productivity increases, connecting people and communities, and improving standards of living and opportunities across the globe.
While changing the way individuals live, interact, and work, ICT has also proven to be a key precondition for enhanced competitiveness and economic and societal modernization, as well as an important instrument for bridging economic and social divides and reducing poverty.
As we celebrate the 10th anniversary of the Global Information Technology Report (GITR) series and the extraordinary achievements ICT has already made possible over the past 10 years, we also want to take the opportunity to look forward and imagine the next transformations enabled by ICT—transformations 2.0.
The pace of technological advance is accelerating, and ICT is increasingly becoming a ubiquitous and intrinsic part of people’s behaviors and social networks as well as of business practices and government activities and service provision. We expect transformations 2.0 to continue to move human progress forward by further leveraging ICT’s positive social, political, and economic impact on governments, enterprise, and civil society alike.
The GITR series has been published by the World Economic Forum in partnership with INSEAD since 2001, accompanying and monitoring ICT advances over the last decade as well as raising awareness of the importance of ICT diffusion and usage for long-term competitiveness and societal well-being. Through the lens of the Networked Readiness Index (NRI), the driving factors of networked readiness and ICT leveraging have been identified, highlighting the joint responsibility of all social actors, namely individuals, businesses, and governments, in this respect. Over time, the series has become one of the most respected studies of its kind.
It has been extensively used by policymakers and relevant stakeholders as a unique tool to identify strengths on which to build and weaknesses that need to be addressed in national strategies for enhanced networked readiness.
The Global Information Technology Report 2010–2011 features the latest results of the NRI, offering an overview of the current state of ICT readiness in the world. This year’s coverage includes a record number of 138 economies from both the developing and developed world, accounting for over 98 percent of global GDP. A number of essays and case studies on transformations 2.0 and best practices in networked readiness are featured in the Report, together with a comprehensive data section—including detailed profiles for each economy covered and data tables with global rankings for the NRI’s 71 indicators.
Chief Business Officer, World Economic Forum
We also recommend participating in the event “¿Hacia dónde va la ciencia en México? Hacia una Agenda Nacional en Ciencia, Tecnología e Innovación” (Where is science in Mexico headed? Toward a national agenda for science, technology, and innovation).