U.S.: Big Data Brings Big Skillset Gaps
Monday, October 29, 2012
There are opportunities in big data, but also big challenges for agencies hoping to get up to speed. In a recent poll of 151 federal IT professionals, MeriTalk found that agencies have less than half of the storage, computing power and manpower they would need to leverage big data – with human capital being the most difficult to secure.
Addressing big data challenges starts with a solid strategy, said Micheline Casey, former chief data officer for Colorado and principal at CDO, LLC. Questions to think about include: What is the desired objective? What are the problems or opportunities that should be pursued?
The next move, she said, would be to boil everything down to a couple of use cases to tackle, then dive in and experiment with the data and the technologies. However, not all big data technologies are created equal, and not every one will be appropriate for every use case, Casey cautioned.
“Real-time analysis of sensor or cyber data is different from doing large-scale batch analytics for predictive or prescriptive analytical purposes,” she said. “This is where the data scientists can really be diving into the data to see what insights they’re able to pull out of it.”
To successfully manage big data, agencies should also consider setting up a steering committee or governance board with agency members, suggested Bethann Pepoli, CTO of the public sector and education at EMC Corporation.
For starters: Add just a few outcomes to measure and ensure you have the right agency involvement, she said.
“Once the appropriate governance is established, which in some cases can take time to line up, it will be important to acquire the skill sets needed to create the structure or framework for how the data needed to report on the outcome can be quickly searched and assimilated with other pieces of data to create insight or turn the data into something meaningful,” said Pepoli, who recently served as commissioner on the TechAmerica Big Data Commission and contributed to the report “Demystifying Big Data: A Practical Guide to Transforming the Business of Government.”
But as big data in government continues to evolve, there will be a greater need to hire for more specialized roles than the traditional data-manager jobs. However, while CDO and data scientist roles will be important down the line, plenty can be achieved with existing people, Pepoli said.
“To start, it is important to think about and define what you want to measure,” she said. “That short list of outcomes will drive the skill sets needed to ensure that the outcomes can be measured and reported on.”
For example, establishing good governance up front with clear goals will be key, she said. “Existing data management resources could identify where all of the data sits today, then the hard work begins of ensuring data sharing and formalizing the data structures,” Pepoli said. “Additional skills will be needed at that point.”
Although it is always beneficial for federal agencies to attract and retain professionals with the skillsets necessary to meet current and future technology challenges, agencies do not need to assume that burden on their own, said Ray Muslimani, president and CEO at GCE.
“Agencies are already seeing benefits from tapping big data technologies delivered through a cloud-service provider, an approach that places the technology and skillset responsibilities on the provider, rather than the agency alone,” he said.
How we work and live today has created an environment where data exists in many disparate locations, and “unstructured content continues to grow at an unbelievable pace,” Pepoli said.
“CDO positions will absolutely be needed and we are seeing many government entities moving in this direction,” she said, citing Casey, who served as Colorado’s first CDO, and the more recent appointment of Brett Goldstein as Chicago’s CDO.
The ability to analyze data – particularly unstructured data – requires agencies to recruit data scientists, and those skills are in short supply, said Mark Weber, president of the U.S. public sector at NetApp.
“In order to address this challenge, the government will likely have to use a multipronged approach, competing with the private sector for talent, using contractors and also hiring young talent and developing them on their own,” he said.
The trend of hiring more data scientists in the private sector will be an emphasis for hiring in the public sector as well, said Christine Meyers, senior product marketing manager at Attachmate.
Rich datasets provide new opportunities to "see" patterns and activities previously hidden, and transitioning from managing data to using it to provide actionable insights will be critical moving forward, she stressed.
“We're winding up the baseball season, so Billy Beane is on my mind -- he pioneered an example of how an industry was turned on its head overnight as he helped bring a science of decision making to the sport,” Meyers said. “I see a similar opportunity within the federal government. Data speaks volumes. We need new approaches and new technologies to help us hear what that data is telling us.”
The future will see greater specialization as data-centric roles emerge, including analysts with particular expertise in one or more types of data, such as cyber, sensor and video, Weber said.
“One of the interesting things about big data is that you don’t know what you are going to find until you start looking for it,” he said. “This is an exciting development for IT professionals, giving them an opportunity to have a greater impact on the mission of their organization, especially as more routine IT processes become ever more enmeshed in the day-to-day operations of the government.”