A vertical community of Federal big data leaders, project managers, industry, and government IT community stakeholders focused on public-private collaboration and best-practice exchange.
The Internal Revenue Service’s research unit has a suitably expansive agenda for big data. The agency pulls in voluminous data on revenue collection, refunds, and enforcement efforts. The task of the IRS Research, Analysis, and Statistics division is to sort through that data and help the agency make better decisions. The division’s responsibilities include econometric modeling, forecasting, and compliance studies. It also serves as the IRS’ focal point for developing research databases and supporting infrastructure. The research group taps big data to support those activities. The data-driven approach promotes greater efficiency in the resources used for tax administration, according to Jeff Butler, director of Research Databases within the IRS Research, Analysis, and Statistics division.
NASA’s Jet Propulsion Laboratory, like many large organizations, is taking on the Big Data problem: the task of analyzing enormous data sets to find actionable information. In JPL’s case, the job involves collecting and mining data from 22 spacecraft and 10 instruments, including the Mars Science Laboratory’s Curiosity rover and the Kepler space telescope. Tom Soderstrom, IT chief technology officer at JPL, joked that his biggest Big Data challenge is more down to Earth: dealing effectively with his email inbox. But kidding aside, JPL now confronts Big Data as both a key problem and a key opportunity. “If we define the Big Data era as beginning where our current systems are no longer effective, we have already entered this epoch,” Soderstrom explained.
For a tough big data challenge, look no further than the U.S. Postal Service (USPS). USPS faces a classic double whammy: the agency has to collect and crunch massive amounts of data, and it has to tackle the job quickly. Speed matters as the agency aims to detect fraud, deliver tracking updates to customers, and respond to requests from regulators. The Postal Service has responded with an architectural approach designed to rapidly ingest and process data culled from thousands of sources across the postal enterprise. Scot Atkins, program manager for Supercomputing & Revenue Protection at USPS, has helped shape the Postal Service’s big data efforts. He cited the agency’s push for real-time processing as perhaps its biggest big data challenge.
MeriTalk sat down with Scott Pearson, director of big data solutions at Brocade, to discuss the state of Big Data in the Federal government: What is the most interesting thing about Big Data? Who is driving Big Data adoption at agencies? And what should IT keep in mind as it looks to deliver Big Data solutions?
Big data has less to do with size and more to do with the growing recognition that data and analysis hold seemingly limitless potential for improving government and society. But data alone does not deliver value. Real value is created when government can bring together data – big or traditional – from multiple sources and locations, and present that information in a way that encourages exploration and insight. Qlik allows you to extend big data analytics to the edges of your agency. Read Qlik’s case study to learn more.