By Dr. Norman Jacknis, ICF Senior Fellow, April 22, 2015
The original article may be found at http://njacknis.tumblr.com/post/117084058588/is-open-data-good-enough
Last week, on April 16th, the Knowledge Society Forum of the Eurocities group held its Beyond Data event in Eindhoven, the Netherlands. The KSF consists of more than 50 European policy makers focused on Open Data. They were joined by many other open data experts and advocates.
I led off with the keynote presentation. The theme was simple: we need to go beyond merely opening (i.e., releasing) public data and there are a variety of new technologies that will make the Open Data movement more useful to the general public.
Since I was speaking in my role as Senior Fellow of the Intelligent Community Forum (ICF), I drew a parallel between that work and the current status of Open Data. I pointed out that ICF has emphasized that an “intelligent city” is much more than a “smart city” with technology controlling its infrastructure. What makes a community intelligent is whether and how it uses that technology foundation to improve the experience of living there.
Similarly, to make the open data movement relevant to citizens, we need to go beyond merely releasing public data. Even hackathons and the encouragement of app developers have their limits, in part because developers in private companies will try to find some way to monetize their work, but not all useful public problems have profit potential.
To create this value means focusing on data of importance to people (not just what’s easy to deliver), undertaking data analytics, following up with actions that have real impact on policies and programs and, especially, engaging citizens in every step of the open data initiative.
I pointed out how future technology trends will improve every city’s use of its data in three ways:
1. Data collection, integration and quality
2. Visualization, anywhere it is needed
3. Analytics of the data to improve public policies and programs
For example, social data (like sentiment analysis) and the Internet of Things can be combined with data already collected by the government to paint a much richer picture of what is going on in a city. In addition to drones, iBeacons, and visual analyzers (like Placemeter), there are now inexpensive, often open source, sensor devices that the public can purchase and use for more data collection.
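To make the sentiment-analysis idea concrete, here is a minimal sketch of the simplest possible approach: counting positive and negative keywords in citizen messages. The word lists and messages are invented for illustration; real systems use far richer lexicons and models.

```python
# Naive keyword-based sentiment scoring of citizen messages.
# POSITIVE/NEGATIVE word lists and the sample messages are illustrative only.
POSITIVE = {"good", "great", "clean", "safe", "love"}
NEGATIVE = {"bad", "dirty", "unsafe", "broken", "late"}

def sentiment_score(text: str) -> int:
    """Return (# positive keyword hits) - (# negative keyword hits)."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

messages = [
    "The new park is clean and safe",
    "The bus was late again and the stop is broken",
]
scores = [sentiment_score(m) for m in messages]  # [2, -2]
```

Aggregating such scores by neighborhood or service area is one simple way to fold public sentiment into the picture a city already has from its own data.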
Of course, all this data needs a different kind of management than businesses have used in the past. So I pointed out NoSQL database management systems and Dat for real-time data flow. Some of the most interesting analytics are based on the merger of data from multiple sources, which poses additional difficulties that are beginning to be overcome through linked data and the new geospatial extension of the semantic web, GeoSPARQL.
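The core of that multi-source merger is joining records from different datasets on a shared key, which linked-data technologies generalize across the web. A minimal sketch, with made-up district names and values, assuming two city datasets keyed by district:

```python
# Merging records from two sources on a shared key (district) -- the kind of
# join that linked data and GeoSPARQL generalize across datasets and geometries.
# All field names and values below are invented for illustration.
air_quality = {"centrum": 42, "strijp": 55}   # e.g. sensor PM2.5 readings
complaints = {"centrum": 3, "strijp": 9}      # e.g. citizen service reports

merged = {
    district: {"pm25": air_quality[district], "complaints": complaints[district]}
    for district in air_quality.keys() & complaints.keys()
}
```

In practice the hard part is that real datasets rarely share clean keys, which is exactly the problem linked data addresses by giving entities stable, shared identifiers.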
If this data – and the results of its analysis – are to be useful, especially in real time, then data visualization needs to be everywhere. That includes using augmented reality and even projecting results on surfaces, much like TransitScreen does.
And if all this data is to be useful, it must be analyzed, so I discussed the key role of predictive analytics in going beyond merely releasing data. But I emphasized the way that residents of a city can help in this task and cited the many people already involved in Zooniverse. There are even tools to help people with limited statistical literacy, as you can see on Public Health Ontario.
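At its simplest, predictive analytics means fitting a model to historical data and using it to forecast. A minimal sketch of a one-variable ordinary-least-squares fit, with invented numbers (say, daily service calls versus temperature):

```python
# Ordinary least squares for one predictor variable -- the simplest form of
# the predictive analytics discussed above. Data points are made up.
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared prediction error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

temps = [10, 20, 30]      # hypothetical daily temperatures
calls = [100, 150, 200]   # hypothetical daily service calls
slope, intercept = fit_line(temps, calls)
prediction = slope * 25 + intercept  # forecast for a 25-degree day
```

Real city analytics would use many more variables and proper validation, but the logic is the same: learn from data the city already publishes, then act on the forecast.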
Finally, the data can also be used by people to help envision – or re-envision – their cities through tools like Betaville.
Public officials have to go beyond merely congratulating themselves on being transparent by releasing data. They need to take advantage of these technological developments and shift their focus to making the data useful to their residents – all in service of the overriding goal of improving their residents’ quality of life.