OpenUp Blog

Flint SPARQL Editor

Until now, when submitting SPARQL queries via a web page, users have generally been presented with a bland text box (on our site and everyone else’s). There is none of the help we have come to expect from IDEs such as Eclipse when editing code: syntax highlighting and on-the-fly error checking. Marijn Haverbeke’s excellent CodeMirror JavaScript library goes much of the way towards solving this. Version 1.0 of CodeMirror provided a syntax-highlighting mode for SPARQL. However, it doesn’t flag up errors, so users don’t find out about mistakes in their query until they get a message back from the server.

At TSO, we have been thinking about how far in-browser tools can go in supporting the editing of SPARQL queries. We have implemented SPARQL highlighting in CodeMirror 2.0 and also hooked in a full JavaScript parser for SPARQL 1.0 (built using David Majda’s excellent PEG.js parser generator). This enables not just highlighting but full syntax checking as you type. The client-side code also detects what type of query you have written (SELECT, CONSTRUCT, DESCRIBE or ASK) and enables the appropriate options for choosing the output format. We think this gives a much smoother ride than current web-based SPARQL editors.
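Detecting the query form doesn’t need the full parser. As a simplified sketch of the idea (the function name and regular expressions here are our illustration, not the actual OpenUp code, which uses the generated PEG.js parser):

```javascript
// Detect the query form of a SPARQL string: "select", "construct",
// "describe" or "ask". This blanks out IRIs (so a '#' inside <...>
// isn't mistaken for a comment), strips comments, then looks for the
// first query-form keyword.
function queryType(sparql) {
  var m = sparql
    .replace(/<[^>]*>/g, "<>")   // blank out IRIs
    .replace(/#[^\n]*/g, "")     // drop comments
    .match(/\b(SELECT|CONSTRUCT|DESCRIBE|ASK)\b/i);
  return m ? m[1].toLowerCase() : null;
}

var q = "PREFIX foaf: <http://xmlns.com/foaf/0.1/>\n" +
        "SELECT ?name WHERE { ?p foaf:name ?name }";
// queryType(q) returns "select"
```

A real implementation would also have to ignore keywords inside string literals, which is exactly why a full parser is the more robust route.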

April 18, 2011, 6:33 pm

How to Procure Data Services

Procuring open data services is now easy for government!

Please tell your procurement team that The National Archives and Buying Solutions (part of the Efficiency and Reform Group within the Cabinet Office) have worked in partnership to develop and deliver a framework agreement for Digital Continuity, which includes many open data services and solutions.

Buying Solutions estimates that using Frameworks saves 77 days on average in the procurement process and eliminates the costs associated with undertaking an OJEU procurement.

The aim of the framework is to help public sector organisations maximise efficiency and value for money in their procurement activities.

Here is Buying Solutions' own description:

  • Buying Solutions enters into Framework Agreements with suppliers so that purchasers in the public sector may obtain value for money (VFM) whilst complying with UK and EU Legislation
  • The evaluation criteria used in each competition are designed to award Framework Agreements to suppliers submitting Most Economically Advantageous Tender (MEAT), taking into account price, quality, capacity and track record
  • Using Framework Agreements saves time and money for purchasers and ensures that terms and conditions of their Contract are robust and comply with best practice

It provides:

April 4, 2011, 12:02 pm

Open Data Experts Award OpenUp Prize

On 23 March 2011, we hosted the culmination of the OpenUp contest, which had been running for the last year to find the best ideas for using public sector data.

We were delighted to have our distinguished panel of judges in attendance: Professor Nigel Shadbolt, Charles Arthur, Lucian J Hudson, Emma Mulqueeny, Ashley Friedlein and Robin Brattel, as well as eminent guests including the former Director of Digital Engagement, Andrew Stott.

While our expert panel of judges interviewed the four finalists, some of those instrumental in opening up government data spoke to a seminar audience about the challenges and benefits.

John Sheridan, Head of Legislation Services for The National Archives, talked about how the service was created on open data principles to deliver better services more efficiently.

Jeni Tennison, a specialist consultant in XML-related technologies and linked data, talked about the importance of URIs and of naming things well.

Paul Davidson, CIO of Sedgemoor District Council, talked about the challenges local government faces in making its data transparent.

March 25, 2011, 10:44 am

Our Thoughts on Provenance

It’s all very well providing data on our servers, but the validity and trustworthiness of the data depends on its provenance: Where did it originate? When? Did it come from a trusted source? Who checked it? When? What processing has been applied to it? Has it been processed by something which could have introduced errors? What configurations or parameters were used when the processes were run?

This challenge of representing provenance is an important issue for linked data. Typically, provenance information is represented as a graph, which links together the artifacts (e.g. web pages, documents, scripts), the processes (e.g. execution of programs), and agents (e.g. people, services) which are involved. 

The data on our servers is now collected through our data harvester. Because every collection and transformation process runs within the harvester’s pipeline architecture, it’s easy for us to keep track of where we got each document and what we did to it en route. As a document passes through the pipeline it carries a provenance graph with it; the graph is extended at each processing step and then stored in the RDF store alongside the data it describes. In this way, we are able to consistently record thorough and accurate provenance information.
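As a sketch of the underlying idea (the function and field names here are illustrative, not the harvester’s actual vocabulary, and a real system would record this as an RDF graph rather than a plain array):

```javascript
// Minimal provenance trail: each processing step appends a record
// describing the process applied, the agent that ran it, the
// parameters used, and when it ran. Returning a new array keeps
// earlier provenance immutable.
function recordStep(provenance, process, agent, params) {
  return provenance.concat([{
    process: process,                    // e.g. "fetch", "transform-to-rdf"
    agent: agent,                        // the service or person responsible
    params: params || {},               // configuration used for this run
    timestamp: new Date().toISOString()
  }]);
}

var prov = [];
prov = recordStep(prov, "fetch", "harvester",
                  { url: "http://example.org/data.csv" });
prov = recordStep(prov, "transform-to-rdf", "harvester",
                  { mapping: "example-mapping" });
```

Recording the parameters alongside each step is what lets you answer the question above about which configurations were in effect when a process was run.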

February 3, 2011, 12:16 pm

Introducing the OpenUp Client


TSO has been working with many different technologies and standards as part of our OpenUp offering. Recently we released OpenUpLabs as a way of showing some of the more experimental work that we have been doing, and the OpenUp Client is our latest addition.

The OpenUp Client is an early-stage experiment that tries to bring together many of those technologies to demonstrate what can be achieved by combining information extraction and linked data. The client is also only possible because of recent developments in Cross-Origin Resource Sharing (CORS).

What is the OpenUp Client?

The Client is a bookmarklet: a link you can drag onto your browser’s bookmarks bar and then, once in a web page, click to start the Client. Note that the Client currently only works in Firefox, Safari and Chrome. Opera does not appear to support CORS, and Internet Explorer needs a more involved approach which we have yet to look at.
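For readers unfamiliar with bookmarklets, the general pattern is a `javascript:` URL that injects a loader script into the current page. A generic sketch (the helper name and loader URL below are placeholders for illustration, not the real OpenUp Client address):

```javascript
// A bookmarklet is just a javascript: URL saved as a bookmark.
// Clicking it runs the snippet in the context of the current page;
// the usual pattern injects a <script> tag that loads the real app.
function makeBookmarklet(snippet) {
  return "javascript:(function(){" + encodeURIComponent(snippet) + "})();";
}

// Placeholder loader; a real client points at its own hosted script.
var loader =
  "var s=document.createElement('script');" +
  "s.src='https://example.org/openup-client.js';" +
  "document.body.appendChild(s);";

var link = makeBookmarklet(loader);
```

Once the injected script is running inside the host page, it is the CORS support mentioned above that lets it make requests back to a data server on a different origin.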

February 1, 2011, 9:52 am

Welcome to OpenUp

We’re passionate about opening up government data and seeing it used in innovative ways.

At TSO we’ve been working with government data for more than 200 years, first in print, then online and now as open linked data. Our OpenUp website is about transforming data into more useful formats and engaging developers with the results to create new, more innovative websites and apps.

Here we’ll blog about the latest releases of our data services and new data sets available in linked data formats, and point you towards interesting uses of government data.

To start, take a look at our Data Enrichment Service which transforms text into linked data in one easy step.

And if you have a great idea for using government data, you can enter your idea into our OpenUp contest to win £1,000 cash and a development fund of up to £50k to see your idea brought to life.

If you have feedback or ideas of what you would like to see on the OpenUp site email

October 21, 2010, 11:04 am