Mar 29, 2012 (12:03 PM EDT)
White House Shares $200 Million Big Data Plan
Read the Original Article at InformationWeek
The initiative comes as volumes of data used by government and the private sector expand exponentially. It includes commitments from several federal agencies to develop new technologies to manipulate and manage big quantities of data and use those technologies in science, national security, and education. John Holdren, director of the White House's Office of Science and Technology Policy, compared the effort to federal research that led to breakthroughs in supercomputing and to the development of the Internet.
"While the private sector will take the lead on big data, we believe that the government can play an important role, funding big data research, launching a big data workforce, and using big data approaches to make progress on key national challenges," Holdren said in a press conference to announce the effort. The government is also helping to set big data standards.
The federal agencies working on the initiative will be the National Science Foundation, the National Institutes of Health, the Department of Defense, the Department of Energy, and the U.S. Geological Survey.
Among the big data projects will be a joint solicitation from the National Science Foundation and the National Institutes of Health, which will award up to $25 million in funding for 15 to 20 research projects that, according to the solicitation, will "advance the core scientific and technological means of managing, analyzing, visualizing, and extracting useful information from large, diverse, distributed, and heterogeneous data sets."
In addition to the big data solicitation, the National Science Foundation is implementing a long-term big data strategy that includes encouraging research, funding a $10 million data project at the University of California, Berkeley, and supporting a geosciences data effort called Earth Cube, among other measures.
The Department of Defense, meanwhile, plans to spend about $250 million annually on big data, including $60 million on new research projects. The Defense Advanced Research Projects Agency is creating the XDATA program, a $100 million effort over four years to "develop computational techniques and software tools for sifting through large structured and unstructured data sets."
The National Institutes of Health announced as part of the effort that it has placed 200 Tbytes of genomic data--the world's largest set of human genetic data, according to the White House--on Amazon Web Services as part of the international 1000 Genomes Project.
The Department of Energy is no stranger to big data, being home to some of the most powerful supercomputers in the world. As part of the big data initiative, the agency's Lawrence Berkeley National Laboratory will spend $25 million to create a new research facility, the Scalable Data Management, Analysis, and Visualization Institute.
In a blog post accompanying the announcement, OSTP deputy director Tom Kalil called on industry, universities, and non-profit organizations to join the administration in its efforts.
For their part, technology companies applauded the effort. "The administration's work to advance research and funding of big data projects, in partnership with the private sector, will help federal agencies accelerate innovations in science, engineering, education, business and government," said David McQueeney, VP of software for IBM Research.